How to retrieve info from other websites?





Ecthelion
Hi,

Just something I was wondering: is it possible to retrieve specific info from another website to use on your own, using PHP (or something similar)?

For example: there is this website that has some great stats that I would like to use. I would also like these stats on my forum. Is it possible to write something that grabs this specific piece of information from that other website?
Or even just take a snapshot of some content on that other website and display it...?

Just a small window that gives a view of these stats, that's what I want.
I don't mind if people see it comes from that other website, and the website owner, who doesn't want to give out the code for these stats, doesn't mind other people using them.

Is anything like this possible, or not?
Kind of telling your forum: fetch the info at that exact location on that other website, and display it here...?

I'm sorry if I can't explain exactly what I'm looking for. I just want to give a view of some information that updates frequently and for which I don't have the code...
coeus
PHP code runs server side, so your normal 'view source' methods won't work; I'm not sure if there is any other way to see the website's code. That said, if you post a link to the page here and point out the features you are interested in, I am sure we can figure out how to do them.
Ecthelion
http://clans.tremulous.net/?p=info&id=44

The FSN team is the team I'm part of. It's also for this team that I use my hosting.
(http://teamfsn.frih.net/phpBB2/)

What I would like to have is only this:

kv
It is quite easy: use file_get_contents() to retrieve the HTML source of the URL.

Code:
$filedata = file_get_contents("http://clans.tremulous.net/?p=info&id=44");


Then use string functions to extract the data you need from $filedata.
Ecthelion
I'll try that.
It'll probably take a while before I get how this works, but at least it sounds like a plan.
Hopefully the other site doesn't change its layout then, or I'll have a problem.

Thanks for the tip!
JayBee
You can use something like this, or play with preg/ereg.

Code:
<?php
$text = file_get_contents("http://clans.tremulous.net/?p=info&id=44");
// find first position of start tag
$start = strpos($text,'<fieldset class="players">');
// find first position of close tag after start tag
$length = strpos($text,'</fieldset>',$start) - $start;
// echo a part of page
echo substr($text,$start,$length);
?>
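
If you want to go the preg route instead, here is a minimal sketch that does the same extraction with a regular expression (it assumes the <fieldset class="players"> markup stays exactly as shown above):

Code:
<?php
$text = file_get_contents("http://clans.tremulous.net/?p=info&id=44");
// 's' lets '.' match newlines, 'U' keeps the match ungreedy so it stops
// at the first closing </fieldset>
if (preg_match('#<fieldset class="players">.*</fieldset>#sU', $text, $matches)) {
    echo $matches[0];
} else {
    echo 'Stats block not found - the page layout may have changed.';
}
?>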
Ecthelion
Hey thanks!

You did all the work for me!

This gives exactly what I needed!

Well, that's something new I learned :)

Thanks for the help, all of you !
manum
Dude, I suggest using cURL... search on Google for cURL tutorials. I'm sure you'll love it.
Fire Boar
I wouldn't. Why use funky extensions when you've got all you need right there in 4 lines? cURL is good if you want to do all sorts of things with different doobries, but if you only want this one thing it's best to just use the code posted above.

EDIT: Oh, and the great thing about that site is that it's properly formatted, so the HTML is probably not going to change at all even if the website gets completely redesigned. In other words, you can be fairly sure that the code you'll need in PHP won't change.
thinkingskull
Use cURL, it's damn easy.

This example will fetch yahoo.com into yahoo.txt on your server.

I guess Frih.net accounts support cURL to some extent.

Code:

<?php
// open a cURL handle for the page we want to copy
$ch = curl_init("http://www.yahoo.com/");
// open a local file to write the page into
$fp = fopen("yahoo.txt", "w");

// write the response into the file instead of printing it,
// and leave the HTTP headers out
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);

// fetch the page, then clean up
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>

edit by rvec: please use the code tags
JayBee
YES :D it's damn easy

Code:
$text = file_get_contents("http://yahoo.com/");

vs.
Code:
$ch = curl_init("http://www.yahoo.com/");
$fp = fopen("yahoo.txt", "w");

curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);

curl_exec($ch);
curl_close($ch);

fclose($fp);

// TODO: read yahoo.txt contents because you still don't have anything ;)
// for example by this code :-P
// $text = file_get_contents("yahoo.txt");
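
(For completeness: if you do want cURL but without the temporary file, CURLOPT_RETURNTRANSFER makes curl_exec() return the page as a string. A minimal sketch:)

Code:
$ch = curl_init("http://www.yahoo.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
curl_setopt($ch, CURLOPT_HEADER, 0);         // leave the HTTP headers out
$text = curl_exec($ch);
curl_close($ch);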
Dougie1
cURL is much faster to use, but file() gets the page into an array of lines, which is easier to split up.

For some examples go here.
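
A minimal sketch of the file() approach, assuming you only want the lines that mention a particular string (the 'FSN' needle here is just an example):

Code:
<?php
// file() returns the page as an array of lines, one element per line
$lines = file("http://clans.tremulous.net/?p=info&id=44");
foreach ($lines as $line) {
    if (strpos($line, 'FSN') !== false) {
        echo $line;
    }
}
?>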
manav
cURL is definitely an excellent option

While learning about it, I actually wrote a script that logs into a rediffmail.com account and extracts all the emails and contacts in the address book.

This kind of script is used by many social networking websites and is a good example of getting stuff from other websites and displaying it on your own.

Code:
<?php
// log in to Rediffmail with cURL, keep the session cookies,
// then fetch the address book page and pull out the contact list
$cookie_jar = "manav.txt";
$username = "Your Rediffmail username";
$password = "Your Rediffmail password";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://mail.rediff.com/cgi-bin/login.cgi");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie_jar);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)");
curl_setopt($ch, CURLOPT_POSTFIELDS, "FormName=existing&login=$username&passwd=$password&proceed=GO");
$html = curl_exec($ch);

// the session id appears after "session_id=" in the logged-in page
$man = strpos($html, "function do_logout()");
$html = substr_replace($html, '', 0, $man);
$m = strpos($html, "session_id=") + 11;
$sessionidm = substr($html, $m, 32);

// fetch the address book page using that session id
curl_setopt($ch, CURLOPT_URL, "http://f4mail.rediff.com/bn/address.cgi?login=$username&session_id=$sessionidm&FormName=group_book");
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$html = curl_exec($ch);

// the contacts are listed inside a <select> element; cut it out and print it
$firstselect = strpos($html, '<select size="5" name="D1" multiple>');
$lastselect = strpos($html, '</select><BR>');
$finallist = substr($html, $firstselect, $lastselect - $firstselect);
echo $finallist;
?>
gerpg
The easiest ways to do it would be:
- RSS feeds, if the site supplies one (minimal sketch below)
- iframes (not supported in Netscape 4 or lower)
- or the include() function
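
For the RSS option, a minimal sketch using simplexml_load_file (the feed URL below is only a placeholder; use whatever feed the site actually supplies):

Code:
<?php
// load the feed and print the title of every item in it
$feed = simplexml_load_file("http://example.com/feed.rss");
foreach ($feed->channel->item as $item) {
    echo $item->title . "<br />\n";
}
?>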

Be careful when requesting information from other websites, though. Copyright is becoming an increasing problem amongst internet sites; just remember to ask permission to use the information and to correctly credit where it's from.

Louis.
mraek
manav wrote:
cURL is definitely an excellent option


I can't use it. With just this line:
$ch = curl_init();
I get this error:
Fatal error: Call to undefined function curl_init() in /home/...

Is there something I need to do to enable cURL? I'm on server 1.
ashok
Find out whether cURL is enabled on server 1 or not: take a blank PHP page and execute phpinfo() to see the PHP configuration on your server...
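
Or, as a quicker check, a one-line sketch that just asks PHP whether the cURL function exists:

Code:
<?php
echo function_exists('curl_init') ? 'cURL is enabled' : 'cURL is NOT enabled';
?>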
rvec
http://www.frihost.com/phpinfo.php

Like that :)
Code:
cURL support    enabled