Issue with qDebug() showing only the first character of a QString



Qg90
25th August 2014, 22:29
Hi, I'm trying to convert a QString to a double but I only receive a zero.


// Build a QString from the first 60 bytes of the USB buffer
std::string message(usb_data.constData(), 60); //usb_data.length()
QString qmessage = QString::fromStdString(message);

// Split the fields out of the string
QString latitude = qmessage.mid(0,21);
QString longitude = qmessage.mid(23,25);
QString height = qmessage.mid(50);

QString lat_mid = qmessage.mid(0,18);
QString lon_mid = qmessage.mid(23,21);

double lat = lat_mid.toDouble();
double lon = lon_mid.toDouble();

qDebug() << lat << latitude;

My output is 0 "4

d_stranz
26th August 2014, 03:11
So when you look at the contents of message, qmessage, latitude, longitude, height, and the others in the debugger, what do you see? How do you know that usb_data is ASCII and not binary? What does usb_data look like in the debugger? How do you know that usb_data is actually 60 bytes long? How do you know that qmessage is long enough to contain all of the characters you are trying to split out of it?
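A quick way to answer most of those questions without single-stepping is to dump the length and the raw bytes in hex. A minimal sketch using the usb_data you already have:

qDebug() << "len =" << usb_data.size()
         << "hex =" << usb_data.toHex(); // nulls and other non-printables show up here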

Qg90
26th August 2014, 13:28
The data in usb_data is NMEA data, so it's ASCII. In the debugger, usb_data shows up as "4 where it's supposed to be a line of numbers. If I use qDebug() << lat_mid.toLatin1().toHex(); the output is "34003000340035002e003800390032003100", but if I use qDebug() << lat_mid.toLatin1(); the output is "4. I know the data is 60 bytes long because I'm sending it from a uC. And I'm using a QTextEdit to show these variables, something like a console.

On the console I'm seeing:

latitude 40.705465,E
longitude -74.026566,S
lat_mid 40.705465
lon_mid -74.026566

but qDebug() is showing only the "4 for the latitude variable. There must be non-printable characters in there, because all I'm trying to do is convert a string to a double.
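One thing worth knowing here: QString::toDouble() returns 0.0 on failure, so a zero result alone doesn't tell you whether parsing failed or the value really is 0. A small sketch using the optional ok flag:

bool ok = false;
double lat = lat_mid.toDouble(&ok);
if (!ok)
    qDebug() << "toDouble() failed, raw bytes:" << lat_mid.toLatin1().toHex();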

wysota
26th August 2014, 14:41
"34003000340035002e003800390032003100" in hex clearly indicates this is not ASCII data, as the second byte is null (0x00). Considering that every second byte of the string seems to be 0x00, I would guess you have UTF-16 encoded data, similar to what QString stores internally. So I'm guessing that either your data was never ASCII, or it was but you then converted it to something that wasn't ASCII anymore, and then converted that to binary data in an incorrect way.
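For illustration, a minimal sketch of decoding such a buffer as UTF-16 (little-endian, which is what the dump above suggests) instead of as 8-bit chars:

// Every second byte is 0x00, i.e. "4045.8921" encoded as UTF-16LE.
const ushort *utf16 = reinterpret_cast<const ushort*>(usb_data.constData());
int codeUnits = usb_data.size() / 2;   // 2 bytes per UTF-16 code unit
QString qmessage = QString::fromUtf16(utf16, codeUnits);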

Qg90
26th August 2014, 14:55
This is the flow of my program.


void HID_PnP::PollUSB()
{
    buf[0] = 0x00;                                  // report ID
    memset((void*)&buf[2], 0x00, sizeof(buf) - 2);  // clear from buf[2] on (buf[1] keeps its previous value)

    if (isConnected == false) {
        device = hid_open(0x04D8, 0x003F, NULL);

        if (device) {
            isConnected = true;
            hid_set_nonblocking(device, true);
            timer->start(1000);                     // poll once per second
        }
    }
    else {
        if (hid_write(device, buf, sizeof(buf)) == -1) {
            CloseDevice();
            return;
        }
        if (hid_read(device, buf, sizeof(buf)) == -1) {
            CloseDevice();
            return;
        }
    }

    QByteArray usb_data(reinterpret_cast<char*>(buf), sizeof(buf));
    hid_comm_update(usb_data, isConnected);
}
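A side note on PollUSB(): since hid_set_nonblocking() is enabled, hid_read() can also return 0 when no report is waiting, in which case buf still holds its previous contents. A sketch of a stricter check using the return value (hidapi returns the number of bytes actually read):

int bytesRead = hid_read(device, buf, sizeof(buf));
if (bytesRead == -1) {   // device error: drop the connection
    CloseDevice();
    return;
}
if (bytesRead == 0)      // non-blocking read, nothing available yet
    return;              // skip this poll instead of forwarding stale data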



Form::Form(QWidget *parent) : QWidget(parent), ui(new Ui::Form)
{
    ui->setupUi(this);
    plugNPlay = new HID_PnP();

    connect(plugNPlay, SIGNAL(hid_comm_update(QByteArray,bool)),
            this, SLOT(update_gui(QByteArray,bool)));
}

void Form::update_gui(QByteArray usb_data, bool isConnected)
{
    if (isConnected) {
        std::string message(usb_data.constData(), 60); //usb_data.length()
        QString qmessage = QString::fromStdString(message);
        QString latitude = qmessage.mid(0,21);
        QString longitude = qmessage.mid(23,25);
        QString height = qmessage.mid(50);

        QString lat_mid = qmessage.mid(0,18);
        QString lon_mid = qmessage.mid(23,21);

        double lat = lat_mid.toDouble();
        double lon = lon_mid.toDouble();

        qDebug() << usb_data;

        ui->textEdit->append("| " + QTime::currentTime().toString("<i>hh:mm:ss</i>") + " | <b>Latitude:</b> "
                             + latitude + " <b>Longitude:</b> " + longitude + " <b>Height:</b> " + height);
    }
}

I need the lat and lon variables to store doubles parsed from the string.

d_stranz
26th August 2014, 21:32
“34003000340035002e003800390032003100”

If I treat this as 16-bit ASCII (i.e. ignore the 00 bytes), it translates to "4045.8921". Note that this dump is only 18 bytes (9 characters), nowhere near 60.

As wysota said, this looks like 16-bit ASCII, not 8-bit char. std::string is for 8-bit char; std::wstring is for 16-bit wchar_t. So try replacing std::string with std::wstring where you build message from usb_data, and replace QString::fromStdString() with QString::fromStdWString() on the line after it.
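For reference, a minimal sketch of that change; it assumes a platform where wchar_t is 16 bits wide (e.g. Windows). On platforms with a 32-bit wchar_t, decoding the raw bytes with QString::fromUtf16() is the safer route:

// 60 bytes of UTF-16 data = 30 wide characters (assuming 16-bit wchar_t)
std::wstring message(reinterpret_cast<const wchar_t*>(usb_data.constData()),
                     usb_data.size() / sizeof(wchar_t));
QString qmessage = QString::fromStdWString(message);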

Qg90
26th August 2014, 22:50
It was indeed 16-bit ASCII. I used

usb_data = QString::fromUtf16(reinterpret_cast<const unsigned short*>(buf), sizeof(buf) / 2); // the size argument counts 16-bit code units, not bytes

and now it works!!! :D