Issue with qDebug() showing only the first character of a QString
Hi, I'm trying to convert a QString to a double, but I only receive a zero.
Code:
std::string message(usb_data.constData(), 60); // usb_data.length()
QString qmessage = QString::fromStdString(message);

QString latitude  = qmessage.mid(0, 21);
QString longitude = qmessage.mid(23, 25);
QString lat_mid   = qmessage.mid(0, 18);
QString lon_mid   = qmessage.mid(23, 21);

double lat = lat_mid.toDouble();
double lon = lon_mid.toDouble();

qDebug() << lat << latitude;
my output is 0 "4
Re: Issue with qDebug() showing only the first character of a QString
So when you look at the contents of message, qmessage, latitude, longitude, height, and the others in the debugger, what do you see? How do you know that usb_data is ASCII and not binary? What does usb_data look like in the debugger? How do you know that usb_data is actually 60 bytes long? How do you know that qmessage is long enough to contain all of the characters you are trying to split out of it?
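One quick way to answer the last two questions without the debugger is to dump the length and the raw bytes of usb_data before you start slicing it, for example:
Code:
qDebug() << usb_data.size() << usb_data.toHex(); // length in bytes plus a hex dump of the payload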
Re: Issue with qDebug() showing only the first character of a QString
The data in usb_data is NMEA data, so it's ASCII. In the debugger usb_data shows up as "4 where it is supposed to be a line of numbers. If I use qDebug() << lat_mid.toLatin1().toHex(); the output is "34003000340035002e003800390032003100", but if I use qDebug() << lat_mid.toLatin1(); the output is "4. I know the data is 60 bytes long because I'm sending it from a uC, and I'm using a QTextEdit to show these variables, something like a console.
On the console I'm seeing
latitude 40.705465,E
longitude -74.026566,S
lat_mid 40.705465
lon_mid -74.026566
but qDebug() is showing only "4 for the latitude variable. There must be non-printable characters in there, because all I'm trying to do is convert a string to a double.
Re: Issue with qDebug() showing only the first character of a QString
"34003000340035002e003800390032003100" in hex clearly indicates this is not ASCII data, as the second byte is null (0x00). Considering that every second byte of the string seems to be 0x00, I would guess you have UTF-16 encoded data, similar to what QString stores internally. So I'm guessing that either your data was never ASCII, or it was ASCII but you converted it to something that wasn't ASCII anymore and then converted it to binary data in an incorrect way.
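To make the point concrete, here is a minimal, self-contained sketch; the 9-unit buffer is invented to match the hex dump above, and it shows why an 8-bit conversion stops after the first character while a UTF-16 decode recovers the whole field:
Code:
#include <QDebug>
#include <QString>

int main()
{
    // "4045.8921" as UTF-16 code units; in little-endian memory every
    // second byte is 0x00, which is exactly the pattern in the hex dump.
    const ushort raw[] = { '4', '0', '4', '5', '.', '8', '9', '2', '1' };

    // Treated as 8-bit chars, the first embedded 0x00 terminates the string -> "4"
    QString wrong = QString::fromLatin1(reinterpret_cast<const char *>(raw));

    // Decoded as UTF-16, the whole field comes back -> "4045.8921"
    QString right = QString::fromUtf16(raw, sizeof(raw) / sizeof(raw[0]));

    qDebug() << wrong << right << right.toDouble();
    return 0;
}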
Re: Issue with qDebug() showing only the first character of a QString
This is the flow of my program.
Code:
void HID_PnP::PollUSB()
{
    buf[0] = 0x00;
    memset((void *)&buf[2], 0x00, sizeof(buf) - 2);

    if (isConnected == false) {
        device = hid_open(0x04D8, 0x003F, NULL);

        if (device) {
            isConnected = true;
            hid_set_nonblocking(device, true);
            timer->start(1000); // poll every 1000 ms
        }
    }
    else {
        if (hid_write(device, buf, sizeof(buf)) == -1) {
            CloseDevice();
            return;
        }

        if (hid_read(device, buf, sizeof(buf)) == -1) {
            CloseDevice();
            return;
        }
    }

    QByteArray usb_data(reinterpret_cast<char *>(buf), sizeof(buf));
    hid_comm_update(usb_data, isConnected);
}
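Side note: hid_read() returns the number of bytes it actually read (0 if no report is pending in non-blocking mode, -1 on error), so a hedged variant of the read branch could size the QByteArray from that return value instead of always passing sizeof(buf); this is only a sketch of an alternative, not something the fix depends on.
Code:
int n = hid_read(device, buf, sizeof(buf)); // bytes actually read; 0 = nothing pending, -1 = error
if (n == -1) {
    CloseDevice();
    return;
}
if (n > 0)
    hid_comm_update(QByteArray(reinterpret_cast<char *>(buf), n), isConnected);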
Code:
{
    ui->setupUi(this);

    plugNPlay = new HID_PnP();

    connect(plugNPlay, SIGNAL(hid_comm_update(QByteArray, bool)),
            this, SLOT(update_gui(QByteArray, bool)));
Code:
void Form::update_gui(QByteArray usb_data, bool isConnected)
{
    if (isConnected) {
        std::string message(usb_data.constData(), 60); // usb_data.length()
        QString qmessage = QString::fromStdString(message);

        QString latitude  = qmessage.mid(0, 21);
        QString longitude = qmessage.mid(23, 25);
        QString lat_mid   = qmessage.mid(0, 18);
        QString lon_mid   = qmessage.mid(23, 21);

        double lat = lat_mid.toDouble();
        double lon = lon_mid.toDouble();

        qDebug() << usb_data;

        ui->textEdit->append("| " + QTime::currentTime().toString("<i>hh:mm:ss</i>")
                             + " | <b>Latitude:</b> " + latitude
                             + " <b>Longitude:</b> " + longitude
                             + " <b>Height:</b> " + height);
I need the lat and lon variables to store a double acquired from a string.
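For what it's worth, QString::toDouble() takes an optional bool pointer that reports whether the conversion succeeded, which makes a silent 0 like the one above easier to catch; a minimal sketch using the names from the code above:
Code:
bool ok = false;
double lat = lat_mid.toDouble(&ok); // ok stays false if lat_mid is not a valid number
if (!ok)
    qDebug() << "latitude did not parse:" << lat_mid;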
Re: Issue with qDebug() showing only the first character of a QString
Quote:
"34003000340035002e003800390032003100"
If I treat this as 16-bit ASCII (i.e. ignore the 00 bytes), it translates to "4045.8921", and the dump is only 18 bytes (36 hex digits) long, not 60.
As wysota said, this looks like 16-bit ASCII, not 8-bit char. std::string is for 8-bit char; std::wstring is for 16-bit wchar_t. So try replacing std::string with std::wstring in the line that builds message, and replace QString::fromStdString() with QString::fromStdWString() in the line that builds qmessage.
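A rough sketch of that change inside update_gui(), assuming the buffer really is little-endian UTF-16 and that wchar_t is 16 bits wide on your platform (on most Linux toolchains it is 32 bits, in which case QString::fromUtf16() on the raw data is the safer route):
Code:
std::wstring wmessage(reinterpret_cast<const wchar_t *>(usb_data.constData()),
                      usb_data.size() / sizeof(wchar_t)); // length in wchar_t units, not bytes
QString qmessage = QString::fromStdWString(wmessage);

double lat = qmessage.mid(0, 18).toDouble(); // should now yield the numeric latitude field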
Re: Issue with qDebug() showing only the first character of a QString
It was indeed 16-bit ASCII. I used
Code:
usb_data = QString::fromUtf16(reinterpret_cast<const unsigned short *>(buf),
                              sizeof(buf) / 2); // length in UTF-16 code units, not bytes
and now it works!!! :D