
Exporting Drupal Nodes with PHP and Drush
Oct 06, 2015 @ 11:09:11

The php[architect] site has posted a tutorial showing you how to export Drupal nodes with Drush and a bit of regular PHP. Drush is a command line tool that makes working with Drupal outside of the interface simpler and easier to automate.

Drupal 8 development has shown that PHP itself, and the wider PHP community, already provides ways to solve common tasks. In this post, I’ll show you some core PHP functionality that you may not be aware of; pulling in packages and libraries via Composer is a topic for another day.

The tutorial walks through a real-world situation: exporting a CSV file listing the nodes added to the site after a specific date. He points out some of the benefits of doing it the Drush way and then starts in on the code and configuration you need to set the system up. He shows how to create the Drush command itself and update it with a method to export the newest nodes (after validating the date provided). He uses an SplFileObject to write the results of the EntityFieldQuery out to the CSV file, and PHP's generators functionality to fetch the records only a few at a time. Finally he includes the command to execute the export, defining the cut-off date for the node query and how to push the output to a file.
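
For a rough idea of the approach, here's a minimal Drupal 7 style sketch; the command and function names (csvexport, export-nodes) are invented for illustration and this is not the article's exact code:

    <?php

    /**
     * Implements hook_drush_command().
     */
    function csvexport_drush_command() {
      return array(
        'export-nodes' => array(
          'description' => 'Export nodes created after a given date as CSV.',
          'arguments' => array('date' => 'Cut-off date, e.g. 2015-01-01.'),
        ),
      );
    }

    /**
     * Command callback: stream matching nodes to STDOUT as CSV.
     */
    function drush_csvexport_export_nodes($date) {
      $timestamp = strtotime($date);
      if ($timestamp === FALSE) {
        return drush_set_error('BAD_DATE', dt('Could not parse the date.'));
      }
      $csv = new SplFileObject('php://stdout', 'w');
      $csv->fputcsv(array('nid', 'title', 'created'));
      foreach (csvexport_nodes_since($timestamp) as $node) {
        $csv->fputcsv(array($node->nid, $node->title, date('c', $node->created)));
      }
    }

    /**
     * Generator: yield nodes in small batches so memory use stays flat.
     */
    function csvexport_nodes_since($timestamp, $batch = 50) {
      $offset = 0;
      do {
        $query = new EntityFieldQuery();
        $result = $query->entityCondition('entity_type', 'node')
          ->propertyCondition('created', $timestamp, '>=')
          ->range($offset, $batch)
          ->execute();
        $nids = isset($result['node']) ? array_keys($result['node']) : array();
        foreach (node_load_multiple($nids) as $node) {
          yield $node;
        }
        $offset += $batch;
      } while (count($nids) === $batch);
    }

Running it is then a matter of something like "drush export-nodes 2015-01-01 > nodes.csv".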

tagged: export drupal node drush commandline csv output query generator

Link: https://www.phparch.com/2015/10/exporting-drupal-nodes-with-php-and-drush/

Grok Interactive:
Importing Large CSV Files with PHP Part 1: Import Using One Query
Sep 23, 2015 @ 12:19:33

The Grok Interactive blog has posted a tutorial, the first part in a series, showing you how to work with large CSV files in PHP.

Importing CSV files into your application can become a problem when the file is really big, > 65,000 rows big. Each row of the file needs to be parsed, converted into an object, and then saved to a database. All of this needs to happen within a 30 second timeout window. It may sound like an impossible task, but there are actually a couple of solutions that can solve this problem. While working on a project at Grok, I was tasked with doing exactly that.

He talks about the method he tried initially for parsing the large files: splitting them into smaller files and processing them as chunks. He points out that this relies on the file system, though, which made it difficult to debug. He finally came up with a simpler solution: importing the files directly into MySQL via a LOAD DATA LOCAL INFILE command. He shows how to set this up in a controller and "importer" class that handles the upload and import via the importFileContents method (complete code included). He walks through what the code is doing and includes a few notes about configuring the database connection with additional options on the PDO connection to allow the local file load.
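
As a rough sketch of that idea (table and column handling simplified; this is not the article's actual importFileContents code):

    <?php

    // Hand the whole CSV to MySQL in a single LOAD DATA LOCAL INFILE query.
    function importFileContents(PDO $pdo, $filePath, $table)
    {
        $sql = sprintf(
            "LOAD DATA LOCAL INFILE %s
             INTO TABLE `%s`
             FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
             LINES TERMINATED BY '\\n'
             IGNORE 1 LINES",
            $pdo->quote($filePath),
            $table
        );
        return $pdo->exec($sql) !== false;
    }

    // The PDO connection must explicitly allow local file loads
    // (the MySQL server also needs local_infile enabled).
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret', array(
        PDO::MYSQL_ATTR_LOCAL_INFILE => true,
    ));
    importFileContents($pdo, '/tmp/upload.csv', 'contacts');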

tagged: tutorial csv file import large processing chunk mysql load file query

Link: http://www.grok-interactive.com/blog/import-large-csv-into-mysql-with-php-part-1/

Coding Geek:
How does a relational database work
Aug 19, 2015 @ 09:49:41

You may have been using relational databases in your PHP applications for a long time (PHP loves MySQL, after all) but you might not have ever dug deep enough to understand how those databases work internally. In this detailed tutorial from Coding Geek, he dives all the way in and covers everything from the basics out to complex sorting, management components and query handling.

When it comes to relational databases, I can’t help thinking that something is missing. They’re used everywhere. [...] you can google by yourself “how does a relational database work” to see how few results there are. [...] Are relational databases too old and too boring to be explained outside of university courses, research papers and books?

As a developer, I HATE using something I don’t understand. And, if databases have been used for 40 years, there must be a reason. [...] Over the years, I’ve spent hundreds of hours to really understand these weird black boxes I use every day. Relational Databases are very interesting because they’re based on useful and reusable concepts. If understanding a database interests you but you’ve never had the time or the will to dig into this wide subject, you should like this article.

He covers a wide range of topics during the post:

  • O(1) vs O(n²) (or how data sets are handled based on size; see the sketch after this list)
  • Array, Tree and Hash table
  • Global overview (structure of the database system and its pieces)
  • Query manager
  • Statistics (and optimizing storage of the data)
  • Data manager
  • Deadlock
  • Logging
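
The first couple of items are easy to see in plain PHP, since PHP arrays are hash tables under the hood; a small illustrative sketch:

    <?php

    // A dataset of a million ids.
    $ids = range(1, 1000000);

    // O(n): in_array() scans the array until the value turns up.
    $found = in_array(999999, $ids);

    // O(1): flip values into keys once, then look up by hash.
    $index = array_flip($ids);       // one-time O(n) build cost
    $found = isset($index[999999]);  // constant-time lookup afterwards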

Each of these topics comes with a lot of explanation, examples of how the internals are functioning as well as diagrams to help make a bit more sense. If you've ever really wanted to know how that database you use functions, this is definitely the article to check out.

tagged: relational database indepth concepts lowlevel highlevel query optimization transaction buffer

Link: http://coding-geek.com/how-databases-work/

SitePoint PHP Blog:
Caching Hat-trick: Zend Opcache, Etags and Query Caching
Jul 13, 2015 @ 09:57:56

The SitePoint PHP blog has posted three tips on caching that can help speed up your application from the processing level up. The article shares tips on using opcode caching for faster processing, etags for web request caching and query caching on the data side.

In this article, we will be looking at some of the common caching techniques in PHP: Opcache, Expires Headers and Query Caching in MySQL. We’ll look at additional approaches in part 2.

He starts with an introduction to the lifecycle of a typical request made to a PHP-based application, from the fetching of a file to the actual execution. This lays the groundwork for the first kind of caching: an opcode cache, which stores the compiled version of each script so it doesn't have to be re-parsed on every request. He helps you get that enabled and configured and shows how to determine how much it's actually helping. Following this he talks about the "expires" headers you can send from Apache, telling the browser exactly when it needs to fetch new versions of things like CSS, image or JavaScript files. Finally he touches on MySQL query caching, which keeps the result set of an already-parsed query on the server for faster responses to repeated, identical requests.
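
The article handles the expiry side in Apache configuration, but the same idea applied straight from PHP looks roughly like this (the file path and max-age are placeholder values):

    <?php

    // Serve a stylesheet with an ETag and a far-future Expires header,
    // answering repeat requests with 304 Not Modified so the body is
    // never re-sent.
    $file = __DIR__ . '/assets/app.css';
    $etag = '"' . md5_file($file) . '"';

    header('ETag: ' . $etag);
    header('Cache-Control: public, max-age=604800'); // one week
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');

    if (isset($_SERVER['HTTP_IF_NONE_MATCH'])
        && trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
        http_response_code(304);
        exit;
    }

    header('Content-Type: text/css');
    readfile($file);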

tagged: caching zend opcode etags expires query caching tutorial speed performance

Link: http://www.sitepoint.com/caching-hat-trick-zend-opcache-etags-and-query-caching/

Michael Dyrynda:
Filtering models with Eloquent in Laravel
Mar 06, 2015 @ 10:14:12

Michael Dyrynda has a recent post about handling matching and limiting results in Eloquent models in a Laravel-based application.

Say you have a users table with the following fields in it: name, email, city, state, zip. You may want to provide fuzzy searching for the name, email, or city and exact matching for the state and zip fields. Why fuzzy matching for only some of the fields? Well, you might want to search for everyone whose name contains Michael or has an @gmail.com address. Be mindful of the latter; it will expose a large dataset if you're not careful in restricting access to the functionality. You probably wouldn't want to allow it in anything bigger than a proof of concept (which this is!).

He goes through the model process, showing how to set up a simple model with the fields mentioned and make use of query scopes to limit returned results. Code is included showing how to define the "scopeFilter" method in the model and call the "User" model instance with the "filter" method. The example limits the results to only the users matching the supplied "name" and "state" values.
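
A sketch of a scope along those lines (the field lists and class shape here are assumptions, not the post's exact code):

    <?php

    namespace App;

    use Illuminate\Database\Eloquent\Model;

    class User extends Model
    {
        // Columns matched fuzzily vs. exactly, per the post's example.
        protected $fuzzy = array('name', 'email', 'city');
        protected $exact = array('state', 'zip');

        public function scopeFilter($query, array $filters)
        {
            foreach (array_intersect_key($filters, array_flip($this->fuzzy)) as $field => $value) {
                $query->where($field, 'LIKE', '%' . $value . '%');
            }
            foreach (array_intersect_key($filters, array_flip($this->exact)) as $field => $value) {
                $query->where($field, $value);
            }
            return $query;
        }
    }

Calling it then looks like User::filter(array('name' => 'Michael', 'state' => 'SA'))->get().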

tagged: filter model results tutorial eloquent laravel scope query

Link: https://iatstuti.net/blog/filtering-models-with-eloquent-in-laravel

Kevin Schroeder:
If you develop for Magento, know your indexes
Feb 02, 2015 @ 09:34:19

Kevin Schroeder makes a suggestion to all of the Magento developers out there - be sure to know your indexes and how to use them to your advantage.

When I first got into Magento development, in my mind, there were two ways of getting data from the database. You would either call Mage::getModel(‘catalog/product’)->load($id) or you would work with the collection. If you wanted to get a filtered list of something you would use the ORM to get it. But as I’ve gained more experience (fairly quickly, I might add) I realized that there was more to the puzzle. A good portion of this is because I work with Magento ECG and some of the best Magento devs and architects can be found there and I’m a quick learner.

He gives an example of going beyond the usual one-to-one relationship most people use with Magento's models. He includes an example of wanting to fetch a list of all products in the same category as another and the "anti-pattern" that comes with it. Instead he offers the solution of an index, a simple one that merges the catalog category and product index ID. This makes using a custom query with a handy join much easier and much faster.
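
In Magento 1 terms, the join he describes would look something like this sketch (the index table name follows stock conventions; verify against your install, as this is not his exact code):

    <?php

    // Find ids of products sharing a category with $productId by joining
    // the category/product index against itself -- no model loads in a loop.
    $resource = Mage::getSingleton('core/resource');
    $read     = $resource->getConnection('core_read');
    $table    = $resource->getTableName('catalog/category_product_index');

    $select = $read->select()
        ->from(array('own' => $table), array())
        ->join(
            array('sibling' => $table),
            'sibling.category_id = own.category_id AND sibling.product_id != own.product_id',
            array('sibling.product_id')
        )
        ->where('own.product_id = ?', $productId)
        ->distinct();

    $siblingIds = $read->fetchCol($select);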

tagged: magento database collection query index tutorial category

Link: http://www.eschrade.com/page/if-you-develop-for-magento-know-your-indexes/

SitePoint PHP Blog:
Geospatial Search with SOLR and Solarium
Nov 25, 2014 @ 13:55:56

The SitePoint PHP blog has a new post from Lukas White that gets into the details of combining SOLR searching with Solarium to perform geospatial queries.

In a recent series of articles I looked in detail at Apache’s SOLR and Solarium. To recap; SOLR is a search service with a raft of features – such as faceted search and result highlighting – which runs as a web service. Solarium is a PHP library which allows you to integrate with SOLR – whether local or remote – interacting with it as if it were a native component of your application. If you’re unfamiliar with either, then my series is over here, and I’d urge you to take a look. In this article, I’m going to look at another part of SOLR which warrants its own discussion; Geospatial search.

He uses a simple example, locating airports near a given location, to give a more "real world" idea of how it all works. He starts by introducing the concept of geospatial searching and the idea of "points" as they relate to a specific location. He then gets into the actual setup of the application, including the SOLR schema configuration and making the queries on the data. The Solarium library allows for simple location queries when given just the "latlong" helper type and the location/distance to use for the starting point. He uses the data from the OpenFlights service to gather the airport data and creates a search form and basic list output of the results from searches on it. If you'd like to see the end result in action, check out this demo website.
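
A stripped-down sketch of the Solarium side (the "latlon" field name and the coordinates are placeholders, not the article's exact code):

    <?php

    require 'vendor/autoload.php';

    $client = new Solarium\Client($config);
    $query  = $client->createSelect();
    $helper = $query->getHelper();

    // Keep only documents within 50 km of the given point.
    $query->createFilterQuery('distance')->setQuery(
        $helper->geofilt('latlon', 51.5074, -0.1278, 50)
    );

    // Sort the matches nearest-first.
    $query->addSort($helper->geodist('latlon', 51.5074, -0.1278), 'asc');

    $result = $client->select($query);
    foreach ($result as $document) {
        echo $document->name, PHP_EOL;
    }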

tagged: solr search solarium library tutorial geospatial query airport demo

Link: http://www.sitepoint.com/geospatial-search-solr-solarium/

Anna Filina:
Reduce number of queries
Oct 29, 2014 @ 10:53:10

In her most recent post Anna Filina makes a recommendation to those looking to increase the performance of an application, especially one that's already in place: simply reduce the number of queries. It sounds simple enough, but can sometimes prove difficult depending on the application.

Customers often call me because their site is slow. One of the most common problems I found was a high number of queries that get executed for every single page hit. When I say a lot, I mean sometimes more than 1000 queries for a single page. This is often the case with a CMS which has been customized for the client’s specific needs.

In this article, aimed at beginner to intermediate developers, I will explain how to figure out whether the number of queries might be a problem, how to count them, how to find spots to optimize and how to eliminate most of these queries. I will focus specifically on number of queries, otherwise I could write a whole tome. I’ll provide code examples in PHP, but the advice applies to every language.

She suggests starting from "the top", looking at the browser's own information on which pieces of data are taking the longest to return back to the client (the latency). This gives a starting direction and tells you where to look for the worst offenders. She talks about a technique to locate and count the queries being made and some common issues found in multiple kinds of software (hint: loops). Then she gets down to the optimization - combining similar queries and better queries through joins.
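
The loop case she hints at is the classic N+1 problem, which in plain PDO terms collapses like this (table and column names are made up for the example):

    <?php

    // Before: one author lookup per post -- 1 + N queries in total.
    foreach ($posts as $post) {
        $stmt = $pdo->prepare('SELECT * FROM authors WHERE id = ?');
        $stmt->execute(array($post->author_id));
        $post->author = $stmt->fetch(PDO::FETCH_OBJ);
    }

    // After: collect the ids, then fetch every author in a single query.
    $ids = array();
    foreach ($posts as $post) {
        $ids[$post->author_id] = true;
    }
    $placeholders = implode(',', array_fill(0, count($ids), '?'));
    $stmt = $pdo->prepare("SELECT * FROM authors WHERE id IN ($placeholders)");
    $stmt->execute(array_keys($ids));

    $authors = array();
    foreach ($stmt->fetchAll(PDO::FETCH_OBJ) as $author) {
        $authors[$author->id] = $author;
    }
    foreach ($posts as $post) {
        $post->author = $authors[$post->author_id];
    }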

tagged: query database performance join similar tips

Link: http://afilina.com/reduce-number-of-queries/

SitePoint PHP Blog:
Essentials of LDAP with PHP
Sep 26, 2014 @ 09:07:37

On the SitePoint PHP blog today Matthew Setter has written up a tutorial covering the essentials of working with LDAP from PHP. He shows how to connect PHP to this industry standard technology and effectively query, update and delete information.

Ever wanted a simple way to store address book style information and network information actually next to any kind of ordered information? If so, then there’s a technology which has been around since 1993, one which despite not having the cool factor of such technologies as Node.js and Go, allows you to do exactly this. It’s called LDAP!

He starts off the tutorial by explaining a bit about what LDAP is (and isn't) for those not familiar with it. He covers some of the basic terminology, pointing you to other articles if you need more than his brief overview. Then he helps you get an LDAP server installed locally (using a package manager, apt-get) and shows how to verify the install is working correctly. From there he shows how to populate a few records and verify they exist. Following this, he gets to the PHP part of things, showing how to use the Zend Framework 2 Zend\Ldap component to access the server, query records and update/delete them easily.
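
The query side with Zend\Ldap boils down to something like this sketch (the host, credentials and base DN are placeholders for your own server's values):

    <?php

    use Zend\Ldap\Ldap;

    $ldap = new Ldap(array(
        'host'     => 'localhost',
        'username' => 'cn=admin,dc=example,dc=com',
        'password' => 'secret',
        'baseDn'   => 'dc=example,dc=com',
    ));
    $ldap->bind();

    // Find every entry under the base DN whose uid starts with "jsmith".
    $results = $ldap->search('(uid=jsmith*)', 'dc=example,dc=com', Ldap::SEARCH_SCOPE_SUB);
    foreach ($results as $entry) {
        echo $entry['dn'], PHP_EOL;
    }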

tagged: tutorial ldap introduction query crud essentials zendframework zendldap

Link: http://www.sitepoint.com/essentials-ldap-php/

SitePoint PHP Blog:
PINQ – Querify Your Datasets – Faceted Search
Aug 26, 2014 @ 10:58:22

The SitePoint PHP blog has continued their series showing the use of the PINQ library for PHP (a PHP implementation of LINQ). In part one they introduced the tool and showed how it could be used to query and sort data. In this second part they move on and show how to perform a multi-faceted search on data from a MySQL database.

We are not going to cover the full aspect of faceted search in this series. Interested parties can refer to relevant articles published on Sitepoint and other Internet publications. [...] Unfortunately, faceted search is not a built-in feature provided by MySQL yet. What can we do if we are using MySQL but also want to provide our users with such a feature? With PINQ, we’ll see there is an equally powerful and straightforward approach to achieving this as when we are using other DB engines – at least in a way.

Building on the code from the first part of the series, they create a few more simple routes that let you define the different facets to use for the searching/sorting. He creates a custom facet class that uses PINQ's "traversable" handling to do the data manipulation. He creates a few different facet objects, each applying a customized filter. Finally, he ties it all back into the endpoint and includes the updated markup to show the results. He finishes up the post mentioning a few limitations and improvements that could be made to the example as well.
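
As a taste of what the PINQ side of such a facet looks like, here's a small sketch (the row shape, with a 'category' key, is an assumption rather than the article's code):

    <?php

    use Pinq\Traversable;

    // Count how many of the already-fetched rows fall into each category.
    $facets = Traversable::from($rows)
        ->groupBy(function ($row) {
            return $row['category'];
        })
        ->select(function ($group) {
            return array(
                'category' => $group->first()['category'],
                'count'    => $group->count(),
            );
        })
        ->asArray();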

tagged: pinq query dataset mysql faceted search tutorial series part2

Link: http://www.sitepoint.com/pinq-querify-datasets-faceted-search/