SitePoint PHP Blog:
We’re Building a Marvel Catalog Reader! Avengers, Assemble!
May 16, 2016 @ 13:23:08

On the SitePoint PHP blog they've shared a tutorial covering the construction of a Marvel Catalog Reader that hooks into the Marvel API for its data.

In this tutorial, we’re going to take a look at the Marvel API, a tool provided by Marvel for developers to get access to the 70-plus years of Marvel comics data. First, we’ll walk through the steps in which one can acquire the keys needed to make requests to the API. Then, we’ll look at the tools we can use to test out the API. Finally, we’re going to build a website that uses the API.

They start out on the API side of things, showing you how to sign up for an account and get your token information, and include a note about using the API (rate limiting and referencing the source of the images). There's a mention and example of working with the interactive API documentation along with the first bit of code you'll need to make the connection. They then get into the construction of the site itself using the Laravel framework and a simple caching command, used to store the results of queries made via a Guzzle client. The focus then shifts to the frontend, where they create the HomeController and define the main endpoint along with two others: one for viewing a specific comic and one for the character listing. The tutorial continues, showing you how to handle the (paginated) responses from each of the calls and push the results into a cache record. Finally, they create the matching views for the API query results and share some example screenshots of the results.
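
Based on that description, the request-plus-cache step might look something like this minimal sketch, assuming Laravel's Cache facade, a Guzzle client and MARVEL_PUBLIC_KEY/MARVEL_PRIVATE_KEY environment variables (the cache key and lifetime are illustrative, not the tutorial's exact code):

    use GuzzleHttp\Client;
    use Illuminate\Support\Facades\Cache;

    // Cache the comics listing for 60 minutes so repeat visits skip the API call
    $comics = Cache::remember('marvel.comics', 60, function () {
        $publicKey  = env('MARVEL_PUBLIC_KEY');
        $privateKey = env('MARVEL_PRIVATE_KEY');

        // Marvel authenticates server-side calls with a timestamp plus an
        // md5 hash of timestamp + private key + public key
        $ts   = time();
        $hash = md5($ts . $privateKey . $publicKey);

        $client   = new Client(['base_uri' => 'https://gateway.marvel.com/v1/public/']);
        $response = $client->get('comics', [
            'query' => ['ts' => $ts, 'apikey' => $publicKey, 'hash' => $hash],
        ]);

        return json_decode((string) $response->getBody(), true);
    });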

tagged: marvel api tutorial laravel frontend cache reader guzzle

Link: http://www.sitepoint.com/were-building-a-marvel-catalog-reader-avengers-assemble/

Loïc Faugeron:
Super Speed Symfony - nginx
Apr 20, 2016 @ 10:48:49

Loïc Faugeron has continued his series about speeding up Symfony applications and getting the best overall performance you can. In this new post he gets into more detail about tuning an Nginx web server (with PHP-FPM) and using the web server's own caching features.

HTTP frameworks, such as Symfony, allow us to build applications that have the potential to achieve Super Speed.

We've already seen a first way to do so (by turning it into a HTTP server), another way would be to put a reverse proxy in front of it. In this article we'll take a Symfony application and demonstrate how to do so using nginx.

He starts by helping you get Nginx and PHP-FPM set up and running on a Unix-based system (installed via apt-get). He provides a simple configuration, including the user to run as and a virtual host for the application. There are a few command-line checks to ensure it's working correctly and a bit of benchmarking as a baseline for the performance testing later. He then gets to the caching functionality and gives some of the basics on how it works inside of Nginx itself. He includes a basic caching configuration (caching to files) and adds it to the already created virtual host. Finally, he includes sample Symfony code to send the "Cache-Control" header with every response and runs the benchmarks again (resulting in about 140x faster responses than without the cache).
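
As a reference point, sending that header from a Symfony controller is a one-liner via the Response API; this is a minimal sketch, not the article's exact code:

    use Symfony\Component\HttpFoundation\Response;

    class HelloController
    {
        public function indexAction()
        {
            $response = new Response('Hello, world!');

            // Marks the response as public and sets "s-maxage", which a
            // reverse proxy like nginx can honor when caching the page
            $response->setSharedMaxAge(10);

            return $response;
        }
    }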

tagged: tutorial nginx performance symfony speed phpfpm setup configuration cache cachecontrol

Link: https://gnugat.github.io/2016/04/20/super-speed-sf-nginx.html

Nginx.com:
Maximizing PHP 7 Performance with NGINX, Part I: Web Serving and Caching
Feb 29, 2016 @ 13:55:10

On the Nginx.com site they've posted the first part of a series showing you how to maximize your performance with PHP 7 and this already speedy web server.

PHP is the most popular way to create a server-side Web application, with roughly 80% market share. (ASP.net is a distant second, and Java an even more distant third.) [...] Now the PHP team is releasing a new version, PHP 7 – more than a decade after the introduction of PHP 5. During this time, usage of the web and the demands on websites have both increased exponentially.

[...] This blog post is the first in a two-part series about maximizing the performance of your websites that use PHP 7. Here we focus on upgrading to PHP 7, implementing open source NGINX or NGINX Plus as your web server software, rewriting URLs (necessary for requests to be handled properly), caching static files, and caching dynamic files (also called application caching or microcaching).

They start by looking at why "PHP hits a wall" in its execution in high-load situations, stepping through the process it follows to handle each request. They also share some of the common ways PHP developers have combated these issues, including more hardware, better server software and multi-server setups. They then get into the actual tips themselves:

  • Tip 1. Upgrade to PHP 7
  • Tip 2. Choose Open Source NGINX or NGINX Plus
  • Tip 3. Convert Apache Configuration to NGINX Syntax
  • Tip 4. Implement Static File Caching
  • Tip 5. Implement Microcaching

For each tip there's a summary with more information on why they make the suggestion and, for some, how to make the transition happen. In the next part of the series they'll get into reverse proxy servers and a multi-server Nginx implementation to boost performance even more.
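
To give a flavor of Tip 4, a static file caching block in nginx can be as short as the sketch below (the extensions and lifetime are illustrative, not the article's exact configuration):

    # Serve common static assets with a long client-side cache lifetime
    location ~* \.(css|js|png|jpg|jpeg|gif|ico|svg)$ {
        expires 30d;
        add_header Cache-Control "public";
        access_log off;
    }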

tagged: performance php7 nginx series part1 maximize tutorial static cache apache conversion

Link: https://www.nginx.com/blog/maximizing-php-7-performance-with-nginx-part-i-web-serving-and-caching/

Laravel News:
How To: Optimizing SSL on Laravel Forge
Jan 14, 2016 @ 09:27:59

On the Laravel News site there's a post showing you how to optimize your SSL support on Forge, the Laravel-related tool that makes creating and configuring servers simpler. The post focuses on a recently added feature to Forge, support for Let's Encrypt certificates, and other SSL optimizations.

Laravel Forge was recently updated to allow one-click installations of Let’s Encrypt certificates. It is now easier than ever to have your own SSL!

Let’s take a few extra minutes to optimize your server and help it perform faster and be more secure. In this tutorial we will look at using SSL session caching, HTTP Strict Transport Security (HSTS), and Hypertext Transfer Protocol 2 (HTTP/2).

The examples they give are more Forge-centric, but the SSL changes and optimizations themselves could be used on any server running Nginx. They talk about:

  • the SSL Logjam fix
  • SSL Optimizations (optimized cypher suite, OCSP stapling)
  • HTTP Strict Transport Security (HSTS)
  • HTTP/2

The post ends with a screenshot of how to test the new configuration and how to restart the web service to put it all into effect. There's also a link to an SSL checker that can help you verify things are set up correctly.
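
For a sense of what those changes look like in an nginx virtual host, here's a minimal sketch (the values are illustrative; the post has the exact configuration it recommends):

    listen 443 ssl http2;                    # serve HTTP/2 alongside SSL

    ssl_session_cache  shared:SSL:20m;       # reuse SSL sessions across workers
    ssl_session_timeout 4h;

    ssl_stapling on;                         # OCSP stapling
    ssl_stapling_verify on;

    # HSTS: instruct browsers to always use HTTPS for this host
    add_header Strict-Transport-Security "max-age=31536000" always;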

tagged: ssl forge laravel tutorial session cache hsts http2 nginx configuration

Link: https://laravel-news.com/2016/01/optimizing-ssl-laravel-forge/

SitePoint PHP Blog:
Automatic Asset Optimization with Munee
Oct 12, 2015 @ 10:26:42

The SitePoint PHP blog has posted a tutorial showing you how to optimize your application's asset handling with Munee.

Munee is an asset management tool which can compile LESS, SCSS, or CoffeeScript, manipulate images, minify CSS and JS, and cache assets on the server and client on the fly. It works with PHP 5.3 and newer versions.

In this tutorial, we will learn how Munee makes it easy to include assets in templates, how to install it, how it works and how to use it. Munee is another way to avoid NodeJS in asset management of PHP apps.

He starts the article with a few reasons why you'd want to use Munee to manage your application's assets, including automatic minification and both server- and client-side cache handling. He covers a bit about how it works and what it does to cache assets via simple HTTP headers. He then gets into the actual installation of the tool, the code needed to run it in your system (a one-line call, sketched below) and how to have the server rewrite all asset requests back to the waiting PHP file. He shows how to compile different asset types including SCSS, LESS and CoffeeScript files as well as minifying JavaScript and CSS. Munee also includes an on-the-fly image resize handler that will also cache the results. Finally, he talks about how you can use it to combine assets and briefly about the API the library provides for some other functionality.
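
That one-line entry point, per Munee's documented usage, looks roughly like this (the filename is illustrative; your web server rewrites asset requests to it):

    <?php
    // munee.php - asset requests are rewritten to this file by the server
    require 'vendor/autoload.php';

    // Munee inspects the requested path, compiles/minifies/resizes as needed,
    // and sets the server- and client-side caching headers automatically
    echo \Munee\Dispatcher::run(new \Munee\Request());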

tagged: asset optimization munee tutorial css javascript less scss coffeescript cache

Link: http://www.sitepoint.com/automatic-asset-optimization-with-munee/

Tideways.io:
Dodge the thundering herd with file-based Opcache in PHP7
Aug 31, 2015 @ 11:55:37

The Tideways.io site has posted a tutorial showing you how to "avoid the thundering herd" of incoming requests to your application using a file-based PHP 7 opcode cache to reduce load and increase performance on your site.

In the last blog post about Fine-Tuning Opcache Configuration I mentioned the thundering herd problem that affects Opcache during cache restarts. When Opcache is restarted, either automatically or manually, all current users will attempt to regenerate the cache entries. Under load this can lead to a burst in CPU usage and significantly slower requests.

[...] In Rasmus' talk at FrOsCon 2015 (Video at 12:30, Slides), he showed the persistent secondary file-based cache Opcache gets in PHP 7. It can read the generated opcodes from disk instead of having to recompile the code after cache restart. This happens only when the compiled opcaches are not found in shared memory.

They talk about the benefits this caching can provide, not only for web-based applications but also for command-line scripts. There's a mention of possible security issues if an attacker is able to read/write the cache files (though filesystem permissions can mitigate that). The post ends with how to enable the feature on your own PHP 7 instance, using the --enable-opcache-file flag at compile time.
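
Once PHP is compiled with that flag, turning the file cache on comes down to a couple of ini settings; a minimal sketch, with an assumed cache path:

    ; php.ini - persist compiled opcodes to disk as a second-level cache
    opcache.file_cache=/var/cache/php-opcache
    ; set to 1 to bypass shared memory entirely (useful for CLI scripts)
    opcache.file_cache_only=0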

tagged: thunderherd opcode cache problem php7 example commandline

Link: https://tideways.io/profiler/blog/dodge-the-thundering-herd-with-file-based-opcache-in-php7

Davey Shafik:
GuzzleHTTP VCR
Aug 24, 2015 @ 10:54:54

Davey Shafik has published a post about a library he's created that acts as a sort of "recorder" for connections made with the Guzzle HTTP client - the Guzzle VCR.

A few days ago I pushed out a very small library to help with testing APIs using Guzzle: dshafik/guzzlehttp-vcr. [...] This is a simple middleware that records a request’s response the first time it’s made in a test, and then replays it in response to requests in subsequent runs.

The handler works by recording the responses from the API (e.g. the JSON response data) and saving them to files (again, as JSON). A one-line call turns the "recording" on and points to where the cached files should be stored. He shows how to use it in the constructor of your Guzzle client, setting it up as the "handler" for the requests. He also includes an example of a few unit tests that make use of the recording feature to check the response of a /test endpoint.
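
Based on the post's description, wiring the recorder into a test looks roughly like this sketch (the fixture path and endpoint are illustrative):

    use Dshafik\GuzzleHttp\VcrHandler;
    use GuzzleHttp\Client;

    // The first run records real responses to the fixture; subsequent
    // runs replay the recorded responses instead of hitting the API
    $vcr = VcrHandler::turnOn(__DIR__ . '/fixtures/example-test.json');
    $client = new Client(['handler' => $vcr]);

    $response = $client->get('http://api.example.com/test');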

tagged: guzzle http client vcr recording response json cache handler

Link: http://daveyshafik.com/archives/69384-guzzlehttp-vcr.html

Matt Stauffer:
Login Throttling in Laravel 5.1
Aug 03, 2015 @ 08:35:57

Matt Stauffer has posted the eleventh part in his series looking at new features in the latest release of the Laravel framework (well, version 5.1). In this tutorial he shows you how to set up and configure login throttling for your Laravel-based application, functionality that now ships with the framework itself (previous versions relied on Graham Campbell's Laravel Throttle package).

Whether or not you know it, any login forms are likely to get a lot of automated login attempts. Most login forms don't stop an automated attack trying email after email, password after password, and since those aren't being logged, you might not even know it's happening.

The best solution to something like this is to halt a user from attempting logins after a certain number of failed attempts. This is called login throttling, or rate limiting. Graham Campbell wrote a great package called Laravel Throttle to address this in previous versions of Laravel, but in Laravel 5.1 Login throttling comes right out of the box.

He shows how to use the ThrottlesLogins trait in your AuthController to have some of the "behind the scenes" work done for you. He shows you how to update your view to relay the possible error message back to the user (and includes a quick screencast of the result). He ends the post with a quick look at what the throttling functionality is doing under the covers: creating a temporary cache item based on username+IP address as a "lock" indicator. Finally, he points out two properties you can set on the auth controller to adjust that behavior: the lockout time and the maximum number of login attempts.
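
In sketch form, the controller side amounts to pulling in the trait and (optionally) overriding its two properties; the values shown below are the framework defaults, not Matt's exact code:

    use Illuminate\Foundation\Auth\AuthenticatesAndRegistersUsers;
    use Illuminate\Foundation\Auth\ThrottlesLogins;

    class AuthController extends Controller
    {
        use AuthenticatesAndRegistersUsers, ThrottlesLogins;

        // Optional overrides; the trait falls back to 5 attempts
        // and a 60-second lockout if these are not defined
        protected $maxLoginAttempts = 5;
        protected $lockoutTime = 60;
    }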

tagged: laravel login throttle tutorial authcontroller laravelthrottle package cache username ipaddress

Link: https://mattstauffer.co/blog/login-throttling-in-laravel-5.1

Cees-Jan Kiewiet:
Composer cache on Travis
Jul 29, 2015 @ 08:46:52

Cees-Jan Kiewiet has posted an article covering the cache directive on the popular Travis-CI continuous integration service and the effect it can have on your builds.

Ever since the Test lowest, current, and highest possible on Travis post I wanted to dive into caching composers cache and vendor on Travis. My experiments started the day after that post.

He starts with an example of a simple .travis.yml build configuration that includes the cache directive, showing the caching of entire directories. He points out that, while this can speed up builds, it also comes with a few problems - one being that cache inconsistencies could cause unintended side effects when major changes are made. Most of these risks are worth the gain, though: he's seen a roughly 50-second job shaved down by around 40 seconds.
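
A minimal .travis.yml along those lines might look like the following sketch (PHP versions and paths are illustrative):

    language: php
    php:
      - 5.6
      - 7.0

    # Persist Composer's download cache and the vendor dir between builds
    cache:
      directories:
        - $HOME/.composer/cache
        - vendor

    install:
      - composer install --prefer-dist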

tagged: composer travisci cache configuration caveats

Link: http://blog.wyrihaximus.net/2015/07/composer-cache-on-travis/

SitePoint PHP Blog:
Speeding up Existing Apps with a Redis Cache
Jul 28, 2015 @ 10:27:06

The SitePoint PHP blog has posted a tutorial that wants to help you speed up your applications with Redis, adding caching to reduce the overall processing load your app has to handle.

The application in question, when executing a query, runs off to Diffbot’s API and makes it query the dataset. The subset is then returned and displayed. This can take up to 5 or so seconds, depending on the busyness of Diffbot’s servers. While the situation will undoubtedly improve as they expand their computational capacity, it would be nice if a query executed once were remembered and reused for 24 hours, seeing as the collection is only refreshed that often anyway.

Considering the fact that implementing this cache costs us literally nothing (and actually reduces costs by reducing strain on the servers), adding it in is an easy win, even if it weren’t used as often as one would hope. There is no reason not to add it – it can only benefit us.

He helps you get Redis up and running as a service on the local system and walks through installing Predis, the PHP library you'll use to talk to Redis for setting and getting the cached information. He includes a few code snippets showing how to send the search off to the Diffbot API, return the results and push them into the cache as serialized data with a day-long timeout. He also mentions the phpiredis extension to reduce some of the overhead that could be caused by using a PHP library versus an extension.
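
The pattern he describes boils down to something like this sketch using Predis; fetchFromDiffbot() is a hypothetical stand-in for the Diffbot API call:

    require 'vendor/autoload.php';

    $redis = new Predis\Client(); // defaults to 127.0.0.1:6379
    $query = 'some search term';
    $key   = 'search:' . md5($query);

    if (($cached = $redis->get($key)) !== null) {
        // Cache hit: reuse the stored result set
        $results = unserialize($cached);
    } else {
        // Cache miss: query the API, then store the result for 24 hours
        $results = fetchFromDiffbot($query); // hypothetical API call
        $redis->setex($key, 86400, serialize($results));
    }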

tagged: speed performance redis cache tutorial introduction predis phpiredis

Link: http://www.sitepoint.com/speeding-up-existing-apps-with-a-redis-cache/