Muhammad Zamroni:
Streaming CSV Using PHP
Feb 16, 2018 @ 15:19:47

On his Medium.com site, Muhammad Zamroni has posted a quick tutorial showing how to stream CSV data from a Laravel application back to the waiting client.

In one of our applications, there's a need to get a list of data from a service. This data is used to generate a report by a background process, resulting in an XLSX file that can be downloaded or attached to an email. This service (an API) is written in Laravel. The background process is also written in Laravel. Both services are hosted on different servers.

We pull data from MySQL and simply send the response in JSON to be easily parsed. The problem we encountered was the amount of data.

The main issue was the memory required to pull in all of the data and work with it. Based on suggestions from another article, he decided to switch the output from JSON to CSV and use Laravel's chunk handling to process pieces of the data at a time. He includes the code for the controller, showing the use of chunk and a manual file pointer to push the data into the response 1,000 records at a time.
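
To make the pattern concrete, here's a minimal sketch of a controller method along those lines (the model and column names are hypothetical, not taken from the article): Laravel's chunk() pulls 1,000 rows at a time while fputcsv() writes each row straight to the output stream, so memory use stays flat no matter how large the result set is.

    use App\Models\Record; // hypothetical Eloquent model
    use Symfony\Component\HttpFoundation\StreamedResponse;

    // Streams the whole table as CSV, 1,000 records at a time.
    public function export(): StreamedResponse
    {
        return new StreamedResponse(function () {
            $handle = fopen('php://output', 'w');

            Record::query()->chunk(1000, function ($records) use ($handle) {
                foreach ($records as $record) {
                    fputcsv($handle, [$record->id, $record->name, $record->created_at]);
                }
            });

            fclose($handle);
        }, 200, [
            'Content-Type'        => 'text/csv',
            'Content-Disposition' => 'attachment; filename="export.csv"',
        ]);
    }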

tagged: stream csv content response laravel chunk tutorial

Link: https://medium.com/@matriphe/streaming-csv-using-php-46c717e33d87

php[architect]:
Exporting Drupal Nodes with PHP and Drush
Oct 06, 2015 @ 16:09:11

The php[architect] site has posted a tutorial showing you how to export Drupal nodes with Drush and a bit of regular PHP. Drush is a command line tool that makes working with Drupal outside of the interface simpler and easier to automate.

Drupal 8 development has shown that PHP itself, and the wider PHP community, already provides ways to solve common tasks. In this post, I’ll show you some core PHP functionality that you may not be aware of; pulling in packages and libraries via Composer is a topic for another day.

The tutorial walks through a real-world situation: exporting a CSV file that lists the nodes added to the site after a specific date. He points out some of the benefits of doing it the Drush way and then starts in on the code and configuration you need to set the system up. He shows how to create the Drush command itself and update it with a method that exports the newest nodes (after validating the date provided). He makes use of an SplFileObject to output the results of the EntityFieldQuery query to the CSV file, and uses PHP's generators functionality to fetch only a few records at a time. Finally, he includes the command to execute the export, defining the date to query the node set against and how to push the output to a file.
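
As a loose sketch of that combination (the command and function names here are illustrative, not the article's exact code), a generator can batch the EntityFieldQuery results while an SplFileObject handles the CSV quoting:

    // Drush command callback: writes nodes created after $date as CSV.
    function drush_mymodule_export_nodes($date) {
      $csv = new SplFileObject('php://stdout', 'w');
      $csv->fputcsv(array('nid', 'title', 'created'));

      foreach (mymodule_newest_nodes(strtotime($date)) as $node) {
        $csv->fputcsv(array($node->nid, $node->title, date('c', $node->created)));
      }
    }

    // Generator: yields nodes one batch at a time instead of loading them all.
    function mymodule_newest_nodes($timestamp, $batch = 50) {
      $offset = 0;
      do {
        $query = new EntityFieldQuery();
        $result = $query->entityCondition('entity_type', 'node')
          ->propertyCondition('created', $timestamp, '>')
          ->range($offset, $batch)
          ->execute();

        $nids = isset($result['node']) ? array_keys($result['node']) : array();
        if (empty($nids)) {
          return;
        }
        foreach (node_load_multiple($nids) as $node) {
          yield $node;
        }
        $offset += $batch;
      } while (count($nids) === $batch);
    }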

tagged: export drupal node drush commandline csv output query generator

Link: https://www.phparch.com/2015/10/exporting-drupal-nodes-with-php-and-drush/

Grok Interactive:
Importing Large CSV Files with PHP Part 1: Import Using One Query
Sep 23, 2015 @ 17:19:33

The Grok Interactive blog has posted a tutorial, the first part in a series, showing you how to work with large CSV files in PHP.

Importing CSV files into your application can become a problem when the file is really big, > 65,000 rows big. Each row of the file needs to be parsed, converted into an object, and then saved to a database. All of this needs to happen within a 30 second timeout window. It may sound like an impossible task, but there are actually a couple of solutions that can solve this problem. While working on a project at Grok, I was tasked with doing exactly that.

He talks about the method he tried initially for parsing the large files: splitting them up into separate files and processing them as chunks. He points out that this relied on the file system, though, which made it difficult to debug. He finally came up with a different, simpler solution: importing the files directly into MySQL via a LOAD DATA LOCAL INFILE query. He shows how to set this up in a controller and an "importer" class that handles the upload and import via the importFileContents method (complete code included). He walks through what the code is doing and includes a few notes about the configuration of the database connection, specifying additional options on the PDO connection to allow the local file load.
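
The core of that approach fits in a few lines; here's a minimal sketch (the table and column names are hypothetical). The key detail is enabling PDO::MYSQL_ATTR_LOCAL_INFILE on the connection so MySQL will accept the local file:

    // Local-infile support is off by default and must be enabled per connection.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret', array(
        PDO::MYSQL_ATTR_LOCAL_INFILE => true,
    ));

    // One query imports the whole file; MySQL does the parsing server-side.
    $sql = "LOAD DATA LOCAL INFILE " . $pdo->quote('/tmp/upload.csv') . "
            INTO TABLE imports
            FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
            LINES TERMINATED BY '\\n'
            IGNORE 1 LINES
            (name, email, created_at)";

    $pdo->exec($sql);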

tagged: tutorial csv file import large processing chunk mysql load file query

Link: http://www.grok-interactive.com/blog/import-large-csv-into-mysql-with-php-part-1/

Ignace Nyamagana Butera:
Q&A: Enforcing enclosure with LeagueCsv
Sep 04, 2015 @ 16:19:44

Ignace Nyamagana Butera has a post on his site showing how to use the LeagueCsv library to enforce field enclosure in CSV output.

It is common knowledge that PHP's fputcsv function does not allow enforcing the enclosure on every field. Using League CSV and PHP's stream filter features, let me show you how to do so step by step.

He walks you through the process of getting the library installed and using it, in several easy steps, to correctly enclose every CSV value regardless of its contents:

  • Install League CSV
  • Choose a sequence to enforce the presence of the enclosure character
  • Set up your CSV
  • Enforce the sequence on every CSV field
  • Create a stream filter
  • Attach the stream filter to the Writer object

Each step includes the code you'll need to make it work, and the final result is shown at the end of the post. He also offers a few tips at the end about additional validation he added and where you can register the stream filter.
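
Condensed into one block, the technique looks roughly like this (using the current league/csv 9 API rather than the version the 2015 post targeted; the sequence and filter name are made up). A rare sequence is prepended to every field so the writer is forced to quote it, and a stream filter then strips the sequence back out of the output:

    use League\Csv\Writer;

    // Step 2: a sequence unlikely to appear in real data.
    const SEQUENCE = "\t\x1f";

    // Step 5: a stream filter that removes the sequence on write.
    class EncloseFilter extends php_user_filter
    {
        public function filter($in, $out, &$consumed, $closing): int
        {
            while ($bucket = stream_bucket_make_writeable($in)) {
                $bucket->data = str_replace(SEQUENCE, '', $bucket->data);
                $consumed += $bucket->datalen;
                stream_bucket_append($out, $bucket);
            }
            return PSFS_PASS_ON;
        }
    }
    stream_filter_register('enclose.all', 'EncloseFilter');

    // Steps 3 and 6: set up the Writer and attach the filter.
    $csv = Writer::createFromPath('/tmp/out.csv', 'w');
    $csv->addStreamFilter('enclose.all');

    // Step 4: enforce the sequence on every field before insertion.
    $csv->insertOne(array_map(
        function ($field) { return SEQUENCE . $field; },
        array('john', 'doe', 'john.doe@example.com')
    ));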

tagged: leaguecsv csv data output encapsulation stream filter

Link: http://nyamsprod.com/blog/2015/qa-enforcing-enclosure-with-leaguecsv/

BitExpert Blog:
Think About It: PHP/PostgreSQL Bulk Performance (Part 3)
Jul 24, 2015 @ 15:46:06

On the bitExpert blog they've continued their "Think About It" series of posts looking at optimizations that can be made to different technologies in their stack to increase performance. In this third part of the series they focus on the changes made to help speed things up with the PostgreSQL database backend.

This article is the last of a three-part series and describes how we optimized the persistence process for bulk data in our code in combination with PostgreSQL. Make sure you've covered the first article, about how we tweaked PHPExcel to run faster while reading Excel and CSV files, and the second article, about how we optimized our data processing and achieved performance improvements by tweaking our code.

They work from the example code provided at the end of part two and optimize its "update" handling. By default it executed one update query for each record so, instead, they modified it to perform a bulk update using an "update from values" format. They could then migrate to a "save all" handler that takes the complete set of records to save.
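
A rough PDO sketch of the "update from values" idea (the table and column names are hypothetical): one statement updates many rows instead of issuing one UPDATE per record.

    $pdo  = new PDO('pgsql:host=localhost;dbname=app', 'user', 'secret');
    $rows = array(array(1, 'alice'), array(2, 'bob'), array(3, 'carol'));

    // Build one placeholder tuple per row: (?::int, ?::text), ...
    $placeholders = implode(', ', array_fill(0, count($rows), '(?::int, ?::text)'));

    $sql = "UPDATE users AS u
            SET name = v.name
            FROM (VALUES {$placeholders}) AS v (id, name)
            WHERE u.id = v.id";

    // Flatten the rows into a single bound-parameter list.
    $stmt = $pdo->prepare($sql);
    $stmt->execute(array_merge(...$rows));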

tagged: performance postgresql bulk series part3 tutorial phpexcel excel csv

Link: https://blog.bitexpert.de/blog/think-about-it-php-postgresql-bulk-performance-part-3/

David Lundgren:
SPL FileObject & LimitIterator
Jun 24, 2015 @ 13:04:24

In the latest post to his site David Lundgren takes a look at two pieces of PHP's Standard PHP Library (SPL): the SplFileObject and the LimitIterator.

Over the last couple of weeks I've come to use the SPL far more than I have in the past. The SplFileObject for reading CSV files is far more convenient than the fgetcsv() function and the associated code needed for a CSV file. Using the LimitIterator allowed me to easily bypass the first row of the CSV, as it contained headers and I knew the format of those headers.

He includes an example of using these two to read from a CSV file, processing the header information and each following row. He also gives another example of the LimitIterator handling the results of a database query, reducing the result set down to only the first twelve items. You can find out more about these two handy tools in their SPL documentation, FileObject and LimitIterator, as well as the rest of the SPL if you haven't looked into it before.
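
A short sketch of the CSV pattern he describes (the file name and columns are made up): SplFileObject parses the rows while LimitIterator starts the iteration at offset 1, skipping the header.

    $file = new SplFileObject('/tmp/data.csv');
    $file->setFlags(
        SplFileObject::READ_CSV |
        SplFileObject::READ_AHEAD |
        SplFileObject::SKIP_EMPTY
    );

    // Offset 1 skips the header row; -1 means "no limit on rows".
    foreach (new LimitIterator($file, 1, -1) as $row) {
        list($id, $name, $email) = $row;
        // ... process a single record
    }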

tagged: spl standardphplibrary fileobject limititerator csv database results

Link: http://davidscode.com/blog/2015/06/22/spl-fileobject-limititerator/

BitExpert Blog:
Processing CSV files in a memory efficient way
Apr 23, 2015 @ 15:50:59

In their latest post, Florian Horn shares some of his experience using the PHPExcel tool to parse CSV files and the performance issues he ran into. Fortunately, he found a solution...in the form of another library.

A little while ago I had to dive deeper into the performance-optimized usage of PHPExcel. Our users are uploading files like Excel or CSV with a lot of data to process. Initially we used the PHPExcel instance without any tuning of the default configuration, which led to heavy memory issues on relatively small files. So I had to avoid reading all of the file content at once into the buffer (like file_get_contents does).

In my research, mainly on optimizing the usage of PHPExcel, I came across a tiny library I have grown really fond of. It is called Goodby/CSV. Both tools have very well-grounded documentation for reading in and understanding the basics and the usage.

He describes some of the main differences between the two tools and includes some basic benchmark results comparing memory consumption and overall speed.
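
As a minimal sketch of the Goodby/CSV side (the file path and row handling are illustrative), the Lexer streams the file and hands the Interpreter one row at a time, so memory use stays roughly constant even on large uploads:

    use Goodby\CSV\Import\Standard\Interpreter;
    use Goodby\CSV\Import\Standard\Lexer;
    use Goodby\CSV\Import\Standard\LexerConfig;

    $config = new LexerConfig();
    $config->setDelimiter(',');

    // The observer is invoked once per row, so the whole file is never
    // held in memory at once.
    $interpreter = new Interpreter();
    $interpreter->addObserver(function (array $row) {
        // persist or process a single row here
    });

    $lexer = new Lexer($config);
    $lexer->parse('/tmp/upload.csv', $interpreter);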

tagged: phpexcel csv file goodbycsv process performance memory benchmark

Link: https://blog.bitexpert.de/blog/processing-csv-files-in-a-memory-efficient-way/

PHPMaster.com:
5 Inspiring (and Useful) PHP Snippets
Jul 02, 2012 @ 15:58:45

On PHPMaster.com there's a new tutorial that shares some useful PHP snippets that you could use in your development.

"X PHP Snippets" type articles abound on the Internet, so why write another one? Well, let's face it… the PHP snippets in them are generally lame. Snippets that generating a random string or return $_SERVER["REMOTE_ADDR"] for the IP Address of a client really aren't that interesting and are of modest usefulness. Instead, here's five snippets that you'll no doubt find interesting and useful, presented along with the problems that inspired them. I hope the creativity in many of them inspire you to write better and more creative code in your own day-to-day endeavors.

Their "five tips" are about:

  • Generating CSV files from an array of data (sketched after this list)
  • Autoloading classes (in a PSR-0 way)
  • Parsing data with the unpack function
  • Templating in HTML (creating a "View" object)
  • Using file_get_contents as a cURL Alternative
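
For the first item, here's a quick sketch of the idea (not the article's exact code): fputcsv() pointed at a php://temp stream turns an array of rows into a CSV string.

    function arrayToCsv(array $rows)
    {
        $handle = fopen('php://temp', 'r+');
        foreach ($rows as $row) {
            fputcsv($handle, $row);
        }
        rewind($handle);
        $csv = stream_get_contents($handle);
        fclose($handle);

        return $csv;
    }

    echo arrayToCsv(array(array('id', 'name'), array(1, 'Alice'), array(2, 'Bob')));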

tagged: snippets csv autoload unpack template filegetcontents

Link:

Code2Learn.com:
Generating CSV file using CodeIgniter Framework
Apr 19, 2012 @ 16:45:52

The Code2Learn site has posted another in their CodeIgniter "series" about producing various kinds of output from an application based on this framework. In this new article Farhan Khwaja shows how to output a CSV-formatted file.

I have already written posts on how to generate PDF files using the CodeIgniter framework and also on how to generate tabulated PDF files using CodeIgniter. This post will help you generate a CSV file using CodeIgniter. The data for the CSV file will be taken from the MySQL database and put into the CSV file.

He includes the source for a basic "Generate" controller class that uses a custom "CSV_Helper" to do the work. It has two methods: one to transform array data and another to take the database result object and extract each record.
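
The article builds its own helper, but for comparison CodeIgniter also ships with a database utility that can do roughly the same job; here's a short sketch along those lines (the table name is hypothetical):

    class Generate extends CI_Controller
    {
        public function index()
        {
            $this->load->dbutil();
            $this->load->helper('download');

            // Pull the rows and let dbutil render the result set as CSV.
            $query = $this->db->get('users');
            $csv   = $this->dbutil->csv_from_result($query);

            // Send the CSV to the browser as a file download.
            force_download('report.csv', $csv);
        }
    }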

tagged: generate csv file codeigniter framework tutorial output helper

Link:

Lorna Mitchell's Blog:
Github to Jira Bug Migration Script
Mar 09, 2011 @ 16:18:18

As part of the Joind.in project's migration from the GitHub issue tracker to a hosted Jira instance, Lorna Mitchell, one of the leads on the project, has written up the export script she used to move their current issues. The code is in her latest post.

I migrated only our open issues and comments (and the comments ended up a bit weirdly formatted on the other end, but this was the best they could do). It was nothing pretty or clever, but I'm sharing it in case it's useful to someone else.

The script connects to the GitHub API and pulls down the information for the open issues, including their titles, users, and issue bodies. This is then used to make another connection for each issue to fetch its comments. The whole thing is dumped out to a CSV file that can be easily imported by the Jira team.
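
A bare-bones sketch of the same idea against today's GitHub v3 API (her 2011 script used the older API, and the repository path and CSV columns here are illustrative):

    // GitHub's API rejects requests without a User-Agent header.
    $context = stream_context_create(array(
        'http' => array('header' => "User-Agent: issue-export\r\n"),
    ));

    $issues = json_decode(file_get_contents(
        'https://api.github.com/repos/joindin/joind.in/issues?state=open',
        false,
        $context
    ), true);

    // Dump title, user, and body for each open issue into a CSV file.
    $out = fopen('issues.csv', 'w');
    fputcsv($out, array('title', 'user', 'body'));
    foreach ($issues as $issue) {
        fputcsv($out, array($issue['title'], $issue['user']['login'], $issue['body']));
    }
    fclose($out);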

tagged: github jira import issue list bug migrate csv api

Link:

