
How can I get all links from a web page with Qt?



ChineseAlexander
15th April 2009, 09:42
I do not need a window, but I would like to parse URLs that are built by JavaScript, since many web pages do not give out their URLs directly.

How should I do this?

wysota
15th April 2009, 10:09
You can parse anything you like with QString and QRegExp.
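
For example, something along these lines (just a rough sketch; the pattern only catches plain href attributes and is not a real HTML parser, and the function name extractHrefs is made up here):

#include <QRegExp>
#include <QString>
#include <QStringList>

// Collect the href values of all <a> tags found in an HTML string.
// A regular expression is only a rough approximation of HTML parsing.
QStringList extractHrefs(const QString &html)
{
    QStringList links;
    QRegExp rx("<a\\s[^>]*href\\s*=\\s*['\"]([^'\"]+)['\"]", Qt::CaseInsensitive);
    int pos = 0;
    while ((pos = rx.indexIn(html, pos)) != -1) {
        links << rx.cap(1);           // first capture group is the URL
        pos += rx.matchedLength();
    }
    return links;
}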

ChineseAlexander
15th April 2009, 10:24
Thanks, but my problem is that it does not seem so easy to deal with JavaScript that builds the full, real URL.

like below:

<a href='javascript:get_url()'>something</a>


By the way, is there an easy way to retrieve a page's source?

like
QSomeClass q("http://google.com");
q.GetText()
??

wysota
15th April 2009, 11:05
We are touching multiple subjects here. Please provide details on what you want to achieve and what you have already tried. If you want contents of a remote page, use QNetworkAccessManager or QHttp to download it. You can execute javascript code from a webpage but you have to extract it first from the page and then feed it to QScriptEngine.
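
For the downloading part, a minimal sketch with QNetworkAccessManager could look like this (Qt 4.4 or later with QT += network in the .pro file, no error handling, and it simply prints the page source to stdout):

#include <QCoreApplication>
#include <QNetworkAccessManager>
#include <QNetworkRequest>
#include <QNetworkReply>
#include <QTextStream>
#include <QUrl>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QNetworkAccessManager manager;
    QNetworkReply *reply = manager.get(QNetworkRequest(QUrl("http://google.com")));

    // the download is asynchronous, so run the event loop until it has finished
    QObject::connect(reply, SIGNAL(finished()), &app, SLOT(quit()));
    app.exec();

    QString html = QString::fromUtf8(reply->readAll().constData());
    QTextStream out(stdout);
    out << html << endl;              // html now holds the page source

    reply->deleteLater();
    return 0;
}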

ChineseAlexander
17th April 2009, 06:41
Hello wysota,

With your hint I have done some work on retrieving links from a web page using QHttp and QRegExp. But one big problem remains: how to retrieve URLs that are composed by JavaScript. QScriptEngine is not so easy to use.

Some days ago I think I saw an interface function like xxxURLs() among the QtWebKit classes, but now I just cannot find it again. Am I misremembering?
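
I could not find such a function, but since QWebPage actually executes the page's JavaScript, maybe something like the following rough sketch would work? It assumes the QtWebKit module (QT += webkit) and simply asks the loaded document for every anchor's resolved href:

#include <QApplication>
#include <QEventLoop>
#include <QTextStream>
#include <QUrl>
#include <QVariant>
#include <QWebFrame>
#include <QWebPage>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);     // QtWebKit needs a QApplication

    QWebPage page;
    QEventLoop loop;
    QObject::connect(&page, SIGNAL(loadFinished(bool)), &loop, SLOT(quit()));
    page.mainFrame()->load(QUrl("http://google.com"));
    loop.exec();                      // wait until the page and its scripts have loaded

    // ask the document itself for every anchor's resolved href
    QVariant result = page.mainFrame()->evaluateJavaScript(
        "var urls = []; var a = document.getElementsByTagName('a');"
        " for (var i = 0; i < a.length; ++i) urls.push(a[i].href); urls");

    QTextStream out(stdout);
    foreach (const QVariant &url, result.toList())
        out << url.toString() << endl;

    return 0;
}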

Does anyone have an idea how to get all the links completely?

Thanks a lot,