News Feed

Phil Sturgeon:
Benchmarking Codswallop: NodeJS v PHP
November 12, 2013 @ 09:21:29

Phil Sturgeon has posted about some Node.js vs PHP web scraping benchmarks that someone linked him to. The article suggests that Node.js "owns" PHP when it comes to this but, as Phil finds out, there's a bit more to the story than that.

Sometimes people link me to articles and ask for my opinions. This one was a real doozy. Oh goody, a framework versus language post. Let's try and chew through this probable linkbait [where] we're benchmarking NodeJS v PHP. Weird, but I'll go along with it. Well, now we're testing cheerio v PhpQuery which is a bit different, but fine, let's go along with it.

Through a little discovery, Phil noticed phpQuery using file_get_contents, a blocking method for fetching the remote pages to scrape. Node.js instead uses a non-blocking method, meaning multiple pages can be fetched at the same time. To answer the blocking vs non-blocking question, he decided to run benchmarks against a few cases - Node.js/Cheerio, PHP/phpQuery and his own, more correct comparison to the Node option - PHP/ReactPHP/phpQuery. He's shared his results, showing a major difference between the straight phpQuery version and the React-based one.

It seems likely to me that people just assume PHP can't do this stuff, because by default most people arse around PHP with things like MAMP, or on their shitty web-host where it is hard to install things and as such get used to writing PHP without utilizing many extensions. It is probably exactly this which makes people think PHP just can't do something, when it easily can.
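To make the blocking vs non-blocking difference concrete, here is a minimal sketch (not Phil's benchmark code) comparing a plain file_get_contents loop with concurrent requests through ReactPHP's HTTP client. It assumes a recent react/http install (the Browser class); ReactPHP's API has changed since the article was written, and the URLs are placeholders.

<?php
// Blocking: each page is fetched one after the other, so the total time is
// the sum of all the individual request times.
$urls  = ['http://example.com/a', 'http://example.com/b']; // hypothetical targets
$pages = [];
foreach ($urls as $url) {
    $pages[$url] = file_get_contents($url); // the script waits here every time
}

// Non-blocking (sketch, assuming react/http's Browser): all requests are
// started at once and handled as the responses come back.
require 'vendor/autoload.php';

$browser = new React\Http\Browser();
foreach ($urls as $url) {
    $browser->get($url)->then(function (Psr\Http\Message\ResponseInterface $response) use ($url) {
        echo $url, ': ', strlen((string) $response->getBody()), " bytes\n";
    });
}
// Current react/event-loop releases run the loop automatically at the end of
// the script, so the responses are printed as they arrive.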
nodejs reactphp webpage scraping benchmark compare

Link: http://philsturgeon.co.uk/blog/2013/11/benchmarking-codswallop-nodejs-v-php

Stuart Herbert's Blog:
PHP Components: Shipping Web Pages With Your Components
August 16, 2011 @ 13:13:06

Stuart Herbert's latest post in his "PHP Components" series looks at an optional but handy thing you can include in your component's package - web pages (be they a manual or some other kind of information). This new post talks about where they should live in the component's package structure.

I'm now going under the bonnet of our components, and looking at the different file roles that the PEAR installer expects to find when we distribute our component as a PEAR-compatible package. It isn't very often that a component needs to ship web pages too, but should the need arise, here's how to do it.

He starts by defining what a "web page" could be (HTML, JavaScript, CSS, etc.) and shows where those files fit in the package hierarchy. When you use the PEAR client to install the package, these files are placed in the "www" folder of your PEAR installation.
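As a rough illustration of the idea, a PEAR-compatible package.xml marks such files with the "www" role so the installer copies them into that folder. The directory and file names below are hypothetical, not Stuart's actual skeleton layout:

<contents>
    <dir name="/">
        <dir name="src">
            <dir name="php">
                <!-- ordinary PHP source, installed into the include path -->
                <file name="MyComponent.php" role="php" />
            </dir>
            <dir name="www">
                <!-- web pages shipped with the component; "www" role files
                     are copied into the PEAR installation's www directory -->
                <file name="index.html" role="www" />
                <file name="styles.css" role="www" />
            </dir>
        </dir>
    </dir>
</contents>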

component webpage structure tutorial pear


TechFounder.net:
Making web-pages go faster using PHP
November 17, 2008 @ 08:42:16

The TechFounder blog has a few general tips you can use to help your web pages go a bit faster:

As might be expected, there are several techniques to optimize the delivery of web pages. The Exceptional Performance guide by Yahoo is a great resource for a multitude of optimization practices, including specifically two techniques which I will address in this article - script minification and concatenation.

Suggestions include reducing the total number of requests and minifying external libraries with the Minify tool.
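Here's a minimal sketch of the concatenation side of that advice (illustrative only - the Minify tool mentioned above does this far more robustly, and the file paths are hypothetical):

<?php
// Serve several JavaScript files as one response to cut the number of
// HTTP requests the browser has to make.
$files = ['js/jquery.js', 'js/plugins.js', 'js/app.js']; // hypothetical paths

$combined = '';
foreach ($files as $file) {
    $combined .= file_get_contents($file) . ";\n"; // ';' guards against missing semicolons
}

// Very crude "minification": drop /* ... */ comments and blank lines.
// (A regex can break string literals containing '/*'; real minifiers parse the code.)
$combined = preg_replace('!/\*.*?\*/!s', '', $combined);
$lines    = array_map('trim', explode("\n", $combined));
$combined = implode("\n", array_filter($lines, 'strlen'));

header('Content-Type: application/javascript');
echo $combined;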

webpage speed load minify performance tutorial


DeveloperTutorials.com:
Scraping Links With PHP
January 14, 2008 @ 08:44:00

The Developer Tutorials site has posted a new article covering the creation of a small application that can help you scrape content from a remote page and pull it into your script.

In this tutorial you will learn how to build a PHP script that scrapes links from any web page.

You'll learn to use cURL, the DOM functions, XPath and a bit of MySQL to get the job done. It's nice to see that they also include a section looking at one of the touchier aspects of web page scraping - "is it legal?"
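The same basic approach can be sketched in a few lines (a simplified version, not the tutorial's actual script; the MySQL storage step is omitted and the target URL is a placeholder):

<?php
// Fetch a page with cURL, then pull every link's href out with DOM + XPath.
$url = 'http://example.com/'; // placeholder target

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

if ($html === false) {
    die("Could not fetch $url\n");
}

$doc = new DOMDocument();
@$doc->loadHTML($html); // '@' silences warnings from real-world, sloppy markup

$xpath = new DOMXPath($doc);
foreach ($xpath->query('//a[@href]') as $anchor) {
    echo $anchor->getAttribute('href'), "\n";
}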

scraping webpage remote curl xpath dom mysql


DevShed:
Building Object-Oriented Web Pages with Inheritance in PHP 5
July 16, 2007 @ 15:18:00

DevShed has posted part one of a new pair of articles that show how to use inheritance in your PHP applications.

In the two articles in this series I'm going to show you how to build a sample object-based web site from its bare bones structure, by using the encapsulated logic of some parent and child PHP 5 classes. In this way I'll demonstrate how inheritance can be used to tackle a concrete project, such as constructing dynamic web pages.

In part one, they build the foundation of their sample application - a web page builder - by defining the WebPage class and abstract methods for it like buildHeader, buildStyles and buildBody. Inheriting from this, they build a HomeWebPage class that defines the methods and values to create a simple page. They take it a step further and build an AboutUsWebPage class to show another implementation.
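A rough sketch of that kind of hierarchy (the article's actual class definitions differ in detail) might look like this:

<?php
// Abstract parent: defines the page-building steps each child must supply.
abstract class WebPage
{
    abstract protected function buildHeader();
    abstract protected function buildStyles();
    abstract protected function buildBody();

    // Shared rendering logic lives in the parent.
    public function render()
    {
        return '<html><head>' . $this->buildHeader() . $this->buildStyles()
             . '</head><body>' . $this->buildBody() . '</body></html>';
    }
}

class HomeWebPage extends WebPage
{
    protected function buildHeader() { return '<title>Home</title>'; }
    protected function buildStyles() { return '<style>body { font-family: sans-serif; }</style>'; }
    protected function buildBody()   { return '<h1>Welcome to the home page</h1>'; }
}

class AboutUsWebPage extends WebPage
{
    protected function buildHeader() { return '<title>About Us</title>'; }
    protected function buildStyles() { return '<style>body { font-family: serif; }</style>'; }
    protected function buildBody()   { return '<h1>About us</h1><p>Who we are and what we do.</p>'; }
}

$page = new HomeWebPage();
echo $page->render();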

php5 object oriented inheritance tutorial webpage


Hasin Hayder's Blog:
Creating Thumbnail of WebPages using WebThumb API
September 06, 2006 @ 06:15:15

From Hasin Hayder's blog, there's a new tutorial demonstrating how to use the newly released WebThumb API from Joshua Eichorn to create thumbnails of websites dynamically.

Using the WebThumb API, you can generate a thumbnail in three steps. First you have to place a request containing the URL. As soon as your request is successful, WebThumb stores your request in a queue. That means you are not getting the thumbnail instantly (there are other factors too; fetching a URL takes time, so it is not possible to generate the thumbnail in real time).

In the second step you have to check whether your thumbnail has been generated or is still in the queue. If you get a green signal, you proceed to the third step, where you request a download URL for your thumbnails.

He shows how to make a request to the API, check the status of your request, and grab the thumbnail once it has been generated. Then it's on to the code - a complete PHP script (using cURL) that makes the full request to the API, including waiting for the image to be finished before grabbing it.
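The overall flow looks roughly like the sketch below. Note that the endpoint, parameter names and response handling here are placeholders, not the real WebThumb request format - see Hasin's post or the WebThumb documentation for the actual API:

<?php
// Small helper: POST parameters to an API endpoint with cURL and return the body.
function apiCall($endpoint, array $params)
{
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($params),
    ]);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}

$apiKey = 'YOUR_API_KEY';                     // placeholder
$api    = 'http://example.com/thumbnail-api'; // placeholder endpoint, not WebThumb's

// Step 1: queue a thumbnail request for the URL.
$jobId = apiCall($api, ['key' => $apiKey, 'action' => 'request', 'url' => 'http://php.net/']);

// Step 2: poll until the thumbnail has been generated.
do {
    sleep(5);
    $status = apiCall($api, ['key' => $apiKey, 'action' => 'status', 'job' => $jobId]);
} while ($status !== 'complete');

// Step 3: download the finished image.
$image = apiCall($api, ['key' => $apiKey, 'action' => 'fetch', 'job' => $jobId]);
file_put_contents('thumbnail.jpg', $image);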

webthumb api thumbnail webpage dynamic curl tutorial



