
Exporting Drupal Nodes with PHP and Drush
Oct 06, 2015 @ 11:09:11

The php[architect] site has posted a tutorial showing you how to export Drupal nodes with Drush and a bit of regular PHP. Drush is a command line tool that makes working with Drupal outside of the interface simpler and easier to automate.

Drupal 8 development has shown that PHP itself, and the wider PHP community, already provides ways to solve common tasks. In this post, I’ll show you some core PHP functionality that you may not be aware of; pulling in packages and libraries via Composer is a topic for another day.

The tutorial works through a real-world situation: exporting a CSV file that lists the nodes added to a site after a specific date. He points out some of the benefits of doing it the Drush way and then starts in on the code and configuration needed to set the system up. He shows how to create the Drush command itself and add a method that exports the newest nodes (after validating the date provided). He uses an SplFileObject to write the results of the EntityFieldQuery out to the CSV file, and PHP's generators functionality to fetch the records only a few at a time. Finally, he includes the command to execute the export, defining the date for the node query and showing how to push that output to a file.
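
As a rough illustration of the pieces involved (not the article's actual code; the command and function names here are invented, and Drupal 7-era APIs are assumed), the generator-plus-SplFileObject combination looks something like this:

    <?php
    // Hypothetical Drush command file (csv_export.drush.inc).
    function csv_export_drush_command() {
      return array(
        'export-nodes' => array(
          'description' => 'Export nodes created after a given date to CSV.',
          'arguments' => array('date' => 'Only export nodes created after this date.'),
        ),
      );
    }

    function drush_csv_export_export_nodes($date) {
      $timestamp = strtotime($date);
      if ($timestamp === FALSE) {
        return drush_set_error('CSV_EXPORT_BAD_DATE', dt('Could not parse date.'));
      }

      // SplFileObject handles the CSV escaping for each row.
      $out = new SplFileObject('php://stdout', 'w');
      $out->fputcsv(array('nid', 'title', 'created'));
      foreach (csv_export_load_nodes($timestamp) as $node) {
        $out->fputcsv(array($node->nid, $node->title, date('c', $node->created)));
      }
    }

    // Generator: yields nodes a batch at a time so memory use stays flat.
    function csv_export_load_nodes($timestamp, $batch = 50) {
      $offset = 0;
      do {
        $query = new EntityFieldQuery();
        $result = $query->entityCondition('entity_type', 'node')
          ->propertyCondition('created', $timestamp, '>')
          ->range($offset, $batch)
          ->execute();
        $nids = isset($result['node']) ? array_keys($result['node']) : array();
        foreach (node_load_multiple($nids) as $node) {
          yield $node;
        }
        $offset += $batch;
      } while (count($nids) === $batch);
    }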

tagged: export drupal node drush commandline csv output query generator

Link: https://www.phparch.com/2015/10/exporting-drupal-nodes-with-php-and-drush/

Grok Interactive:
Importing Large CSV Files with PHP Part 1: Import Using One Query
Sep 23, 2015 @ 12:19:33

The Grok Interactive blog has posted a tutorial, the first part in a series, showing you how to work with large CSV files in PHP.

Importing CSV files into your application can become a problem when the file is really big, > 65,000 rows big. Each row of the file needs to be parsed, converted into an object, and then saved to a database. All of this needs to happen within a 30 second timeout window. It may sound like an impossible task, but there are actually a couple of solutions that can solve this problem. While working on a project at Grok, I was tasked with doing exactly that.

He talks about the method he tried initially for parsing the large files: splitting them up into separate files and processing them as chunks. He points out that this relies on the file system, though, which made it difficult to debug. He eventually settled on a different, simpler solution: importing the files directly into MySQL via a LOAD DATA LOCAL INFILE query. He shows how to set this up in a controller and "importer" class that handles the upload and import via an importFileContents method (complete code included). He walks through what the code is doing and includes a few notes about configuring the database connection, specifying additional options on the PDO connection to allow the local file load.
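
To sketch the idea (table and column names are invented here, and the article's own importer class differs), the key pieces are the PDO option and the query itself:

    <?php
    // The client must opt in to local file loads; the MySQL server must
    // also allow them (local_infile = 1).
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret', array(
        PDO::MYSQL_ATTR_LOCAL_INFILE => true,
    ));

    $template = <<<'SQL'
    LOAD DATA LOCAL INFILE %s
    INTO TABLE contacts
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (name, email, phone)
    SQL;

    $sql = sprintf($template, $pdo->quote('/tmp/contacts.csv'));
    $rows = $pdo->exec($sql); // one statement, no per-row parsing in PHP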

tagged: tutorial csv file import large processing chunk mysql load file query

Link: http://www.grok-interactive.com/blog/import-large-csv-into-mysql-with-php-part-1/

Ignace Nyamagana Butera:
Q&A: Enforcing enclosure with LeagueCsv
Sep 04, 2015 @ 11:19:44

Ignace Nyamagana Butera has a post on his site showing how to use the LeagueCsv library to enforce field enclosure in CSV output.

It is common knowledge that PHP’s fputcsv function does not allow enforcing the enclosure on every field. Using League CSV and PHP stream filter features let me show you how to do so step by step.

He walks you through the process of getting the library installed and using it, in a few easy steps, to enforce the enclosure character on every CSV field regardless of its contents:

  • Install League CSV
  • Choose a sequence to enforce the presence of the enclosure character
  • Set up your CSV
  • Enforce the sequence on every CSV field
  • Create a stream filter
  • Attach the stream filter to the Writer object

Each step includes the code you'll need to make it work, and the final result is shown at the end of the post. He also offers a few extra tips around additional validation he added and where you can register the stream filter.
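
The underlying trick can be sketched with core PHP alone (the post does this through League CSV's Writer, where the exact method names vary by version): prefix every field with a sentinel sequence so the writer is forced to quote it, then strip the sentinel back out with a stream filter.

    <?php
    class EnclosureFilter extends php_user_filter
    {
        const SENTINEL = "\t\x1f"; // a sequence unlikely to appear in real data

        public function filter($in, $out, &$consumed, $closing)
        {
            while ($bucket = stream_bucket_make_writeable($in)) {
                // Remove the sentinel after fputcsv has done its quoting.
                $bucket->data = str_replace(self::SENTINEL, '', $bucket->data);
                $consumed += $bucket->datalen;
                stream_bucket_append($out, $bucket);
            }
            return PSFS_PASS_ON;
        }
    }

    stream_filter_register('enclosure.force', 'EnclosureFilter');

    $fp = fopen('php://output', 'w');
    stream_filter_append($fp, 'enclosure.force', STREAM_FILTER_WRITE);

    foreach (array(array('name', 'age'), array('jane', '42')) as $row) {
        // The tab in the sentinel makes fputcsv treat every field as
        // needing enclosure.
        $fields = array();
        foreach ($row as $field) {
            $fields[] = EnclosureFilter::SENTINEL . $field;
        }
        fputcsv($fp, $fields);
    }
    fclose($fp);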

tagged: leaguecsv csv data output encapsulation stream filter

Link: http://nyamsprod.com/blog/2015/qa-enforcing-enclosure-with-leaguecsv/

BitExpert Blog:
Think About It: PHP/PostgreSQL Bulk Performance (Part 3)
Jul 24, 2015 @ 10:46:06

On the bitExpert blog they've continued their "Think About It" series of posts looking at optimizations that can be made to different technologies in their stack to increase performance. In this third part of the series they focus on the changes made to speed things up on the PostgreSQL database backend.

This article is the last of a three-part series and describes how we optimized the persistence process of bulk data in our code in combination with PostgreSQL. Make sure you covered the first article about how we tweaked PHPExcel to run faster while reading Excel and CSV files and the second article about how we optimized our data processing and reached performance improvements tweaking our code.

They work from the example code provided at the end of part two and optimize its "update" handling. By default the code executed an update query for each record, so they modified it to perform a bulk update using an "update from values" statement instead. They could then move to a "save all" handler that persists the complete set of records at once.
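
A minimal sketch of that "update from values" shape (table and column names are invented; the series uses its own persistence layer):

    <?php
    $pdo = new PDO('pgsql:host=localhost;dbname=app', 'user', 'secret');
    $rows = array(array(1, 9.99), array(2, 19.99), array(3, 4.50));

    $values = array();
    $params = array();
    foreach ($rows as $i => $row) {
        $values[] = "(CAST(:id{$i} AS int), CAST(:price{$i} AS numeric))";
        $params[":id{$i}"] = $row[0];
        $params[":price{$i}"] = $row[1];
    }

    // One round trip updates every record instead of one query per row.
    $sql = 'UPDATE products AS p
            SET price = v.price
            FROM (VALUES ' . implode(', ', $values) . ') AS v (id, price)
            WHERE p.id = v.id';

    $stmt = $pdo->prepare($sql);
    $stmt->execute($params);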

tagged: performance postgresql bulk series part3 tutorial phpexcel excel csv

Link: https://blog.bitexpert.de/blog/think-about-it-php-postgresql-bulk-performance-part-3/

David Lundgren:
SPL FileObject & LimitIterator
Jun 24, 2015 @ 08:04:24

In the latest post to his site David Lundgren takes a look at two pieces of the SPL (Standard PHP Library) - SplFileObject and LimitIterator.

Over the last couple of weeks I've come to use the SPL far more than I have in the past. The SplFileObject for reading CSV files is far more convenient than the fgetcsv() function and the associated code needed for a CSV file. Using the LimitIterator allowed me to easily bypass the first row of the CSV, as it held the headers and I knew the format of those headers.

He includes an example of using these two to read from a CSV file, processing the header information and then each following row. He also gives another example of the LimitIterator handling the results of a database query, reducing the array set down to only the first twelve items. You can find out more about these two handy tools in the SPL documentation for SplFileObject and LimitIterator, as well as the rest of the SPL if you haven't looked into it before.
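
A minimal sketch of the combination (file name and data are assumptions for illustration):

    <?php
    $file = new SplFileObject('data.csv');
    $file->setFlags(
        SplFileObject::READ_CSV |
        SplFileObject::READ_AHEAD |
        SplFileObject::SKIP_EMPTY |
        SplFileObject::DROP_NEW_LINE
    );

    // Offset 1 skips the header row; no third argument means "to the end".
    foreach (new LimitIterator($file, 1) as $row) {
        // $row is already an array of fields thanks to READ_CSV.
        // ... process $row ...
    }

    // The same iterator trims plain arrays too, e.g. the first twelve
    // rows of a query result:
    $queryResults = range(1, 100); // stand-in for database rows
    $firstTwelve = iterator_to_array(
        new LimitIterator(new ArrayIterator($queryResults), 0, 12)
    );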

tagged: spl standardphplibrary fileobject limititerator csv database results

Link: http://davidscode.com/blog/2015/06/22/spl-fileobject-limititerator/

BitExpert Blog:
Processing CSV files in a memory efficient way
Apr 23, 2015 @ 10:50:59

In their latest post Florian Horn shares some of his experience in using the PHPExcel tool to parse CSV files and the performance issues he ran into. Fortunately, he found a solution...in the form of another library.

A little while ago I had to dive deeper into the performance optimized usage of PHPExcel. Our users are uploading files like Excel or CSV with a lot of data to process. Initially we used the PHPExcel instance without any tuning of the default configuration, which led to heavy memory issues on relatively small files. So I had to avoid reading all file content at once into the buffer (like file_get_contents does).

In my research, mainly around optimizing the usage of PHPExcel, I came across a tiny library I have grown really fond of. It is called Goodby/CSV. Both tools have very well-grounded documentation for reading up on and understanding the basics and usage.

He describes some of the main differences between the two tools and includes some basic benchmark results comparing memory consumption and overall speed.
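
For flavor, Goodby/CSV's observer-based API looks roughly like this (as documented at the time; the file name is invented):

    <?php
    require 'vendor/autoload.php';

    use Goodby\CSV\Import\Standard\Lexer;
    use Goodby\CSV\Import\Standard\LexerConfig;
    use Goodby\CSV\Import\Standard\Interpreter;

    $config = new LexerConfig();
    $config->setDelimiter(',');

    $interpreter = new Interpreter();
    $interpreter->addObserver(function (array $row) {
        // Called once per parsed row, so memory use stays constant
        // no matter how large the file is.
        // ... persist $row ...
    });

    $lexer = new Lexer($config);
    $lexer->parse('big-file.csv', $interpreter);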

tagged: phpexcel csv file goodbycsv process performance memory benchmark

Link: https://blog.bitexpert.de/blog/processing-csv-files-in-a-memory-efficient-way/

5 Inspiring (and Useful) PHP Snippets
Jul 02, 2012 @ 10:58:45

On PHPMaster.com there's a new tutorial that shares some useful PHP snippets that you could use in your development.

"X PHP Snippets" type articles abound on the Internet, so why write another one? Well, let's face it… the PHP snippets in them are generally lame. Snippets that generating a random string or return $_SERVER["REMOTE_ADDR"] for the IP Address of a client really aren't that interesting and are of modest usefulness. Instead, here's five snippets that you'll no doubt find interesting and useful, presented along with the problems that inspired them. I hope the creativity in many of them inspire you to write better and more creative code in your own day-to-day endeavors.

Their "five tips" are about:

  • Generating CSV files from an array of data (see the sketch after this list)
  • Autoloading classes (in a PSR-0 way)
  • Parsing data with the unpack function
  • Templating in HTML (creating a "View" object)
  • Using file_get_contents as a cURL Alternative
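
A minimal sketch of the first snippet's idea (not the article's exact code): write through fputcsv into an in-memory stream so PHP handles all the quoting and escaping, then collect the result as a string.

    <?php
    function array_to_csv(array $rows)
    {
        $fp = fopen('php://temp', 'r+');
        foreach ($rows as $row) {
            fputcsv($fp, $row);
        }
        rewind($fp);
        $csv = stream_get_contents($fp);
        fclose($fp);
        return $csv;
    }

    echo array_to_csv(array(
        array('name', 'email'),
        array('Jane', 'jane@example.com'),
    ));
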
tagged: snippets csv autoload unpack template filegetcontents


Generating CSV file using CodeIgniter Framework
Apr 19, 2012 @ 11:45:52

The Code2Learn site has posted another in their CodeIgniter "series" about producing various kinds of output from an application based on this framework. In this new article Farhan Khwaja shows how to output a CSV-formatted file.

I have already written posts on how to generate pdf files using CodeIgniter Framework and also on how to generate tabulated pdf file using CodeIgniter Framework. This post will help you to generate a CSV file using CodeIgniter. The data for the CSV File will be taken from the MySQL Database and will be put into the CSV File.

He includes the source for a basic "Generate" controller class that uses a custom "CSV_Helper" to do the work. It has two methods - one to transform array data and another to take the database result object and extract each record.
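
A hedged sketch of what such a helper pair might look like (function names are guesses, not the article's actual CSV_Helper code; CodeIgniter 2-era APIs assumed):

    <?php
    if (!function_exists('array_to_csv')) {
        function array_to_csv(array $rows)
        {
            $fp = fopen('php://temp', 'r+');
            foreach ($rows as $row) {
                fputcsv($fp, $row);
            }
            rewind($fp);
            return stream_get_contents($fp);
        }
    }

    if (!function_exists('query_to_csv')) {
        // Accepts a CodeIgniter database result object.
        function query_to_csv($query)
        {
            $rows = array($query->list_fields());
            foreach ($query->result_array() as $row) {
                $rows[] = array_values($row);
            }
            return array_to_csv($rows);
        }
    }

    // In a controller, roughly:
    //   $this->load->helper('download');
    //   force_download('report.csv', query_to_csv($this->db->get('users')));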

tagged: generate csv file codeigniter framework tutorial output helper


Lorna Mitchell's Blog:
Github to Jira Bug Migration Script
Mar 09, 2011 @ 10:18:18

As part of the Joind.in project's migration from the GitHub issue tracker to a hosted Jira instance, Lorna Mitchell, one of the leads on the project, has written up the import script she used to move the current issues. The code is in her latest post.

I migrated only our open issues, and comments (and the comments ended up a bit weirdly formatted on the other end but this was the best they could do). It was nothing pretty or clever but in case it's useful to someone else.

The script connects to the GitHub API and pulls down the information for the open issues, including the title, user, and body of each issue. It then makes another connection per issue to fetch its comments. The whole thing is dumped out to a CSV file that can be easily imported by the Jira team.
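
The same flow can be sketched against today's GitHub REST API (the original 2011 script used the API of the day and also fetched each issue's comments, omitted here; the repository path is illustrative):

    <?php
    // GitHub's API requires a User-Agent header.
    $context = stream_context_create(array(
        'http' => array('header' => "User-Agent: issue-export-script\r\n"),
    ));

    $json = file_get_contents(
        'https://api.github.com/repos/joindin/joind.in/issues?state=open',
        false,
        $context
    );
    $issues = json_decode($json, true);

    $fp = fopen('issues.csv', 'w');
    fputcsv($fp, array('title', 'user', 'body'));
    foreach ($issues as $issue) {
        fputcsv($fp, array($issue['title'], $issue['user']['login'], $issue['body']));
    }
    fclose($fp);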

tagged: github jira import issue list bug migrate csv api


Phil Sturgeon's Blog:
PHP Format abstraction with a simple class
Feb 14, 2011 @ 12:15:13

Phil Sturgeon has posted about a utility he's released, php-format, that can help convert data from one format to another, such as arrays to JSON or XML to CSV.

Convert between Array, Object, JSON, XML, CSV and Serialized data and back again easily. I'll add a few more types like YAML when I can be arsed to work out PECL for MAMP.

There's a code snippet in his post showing the conversion of an array to JSON, then to XML, then back to an array. The code is pretty simple to follow, with "to" and "from" methods for each format, making it easy to extend for your own uses.
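
The shape of such a class, in a stripped-down hypothetical sketch (this is not php-format's actual source, just the to/from pattern the post describes):

    <?php
    class Format
    {
        private $data;

        public static function factory($data, $from = 'array')
        {
            $self = new self();
            $method = 'from_' . $from;
            $self->data = $self->$method($data);
            return $self;
        }

        private function from_array(array $data) { return $data; }
        private function from_json($json) { return json_decode($json, true); }

        public function to_json()
        {
            return json_encode($this->data);
        }

        public function to_csv()
        {
            $fp = fopen('php://temp', 'r+');
            foreach ($this->data as $row) {
                fputcsv($fp, (array) $row);
            }
            rewind($fp);
            return stream_get_contents($fp);
        }
    }

    // e.g. echo Format::factory(array(array('a' => 1)))->to_json();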

tagged: convert data json xml csv serialize