Hi there,
Working with a client who's after some remote-server monitoring; something along the lines of a web page that fetches the server load, uptime etc. from a remote location and shows it on the main page.
But they don't just want a list of servers; they also want users to be able to 'search' for a Node and get its current status.
Now I've thought of a couple of ways to do this. One is gathering the data every 5 minutes via cron scripts and storing the results in a MySQL database. The other is just retrieving the output of a "status.php" on each remote node: something that displays, in simple form, "$UPTIME \n $LOAD", which I can then explode into an array and work with that way.
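For the second approach, the node-side script could be as small as this sketch (assumes a unix host; `sys_getloadavg()` isn't available on Windows, and `shell_exec` must not be disabled):

```php
<?php
// status.php -- minimal sketch of the endpoint each node could expose.
$load   = sys_getloadavg();            // array of 1/5/15-minute load averages
$uptime = trim(shell_exec('uptime'));  // raw output of the unix uptime command

// One value per line, so the main site can split on "\n".
echo $uptime . "\n" . $load[0];
```

The main site would then explode the response on "\n" to get the two values back.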
I'm just wondering: at what point would 'file_get_contents' start to lag? I mean, is it safe to assume that fetching a ~2KB .php file (reporting the load average and the output of the unix uptime command) from ~50 servers at once would be acceptable without causing too much of a problem? Obviously the page needs to show little to no decrease in speed compared to the rest of the site.
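The main risk isn't the 2KB payload, it's that file_get_contents blocks until each node answers, so one dead server can hang the page for the default socket timeout. A per-request timeout helps; here's a rough sketch of the fetch loop (hostnames are made up):

```php
<?php
// Fetch status from each node with a short per-request timeout, so one
// slow or unreachable node can't stall the whole page.
$nodes = ['node1.example.com', 'node2.example.com'];

// Cap each request at 2 seconds via a stream context.
$ctx = stream_context_create(['http' => ['timeout' => 2]]);

$status = [];
foreach ($nodes as $host) {
    $raw = @file_get_contents("http://$host/status.php", false, $ctx);
    if ($raw === false) {
        $status[$host] = null;                     // node down or too slow
        continue;
    }
    // Expecting "uptime line \n load", as produced by status.php.
    list($uptime, $load) = explode("\n", trim($raw), 2);
    $status[$host] = ['uptime' => $uptime, 'load' => (float) $load];
}
```

Worst case this is still sequential (50 nodes with a 2s timeout is 100s if everything is down), which is why people tend to either move to parallel requests with the curl_multi_* functions, or fall back to the cron-plus-MySQL approach so the page only ever reads cached local data.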
Would anybody have any suggestions?