The Toptal blog has a post for the Laravel users out there showing you how to handle intensive tasks in your applications. In this case it's creating a background job (run via cron) to import information from an Excel-formatted spreadsheet file into the database.
When dealing with time consuming resource intensive tasks, most PHP developers are tempted to choose the “quick hack route.” Don’t deny it! We’ve all used ini_set('max_execution_time', HUGE_INT); before, but it doesn’t have to be this way. In today’s tutorial, I demonstrate how an application’s user experience may be improved (with minimal developer effort) by separating long-running tasks from the main request flow using Laravel. By making use of PHP’s ability to spawn separate processes that run in the background, the main script will respond faster to user action.
He starts with why PHP isn't a particularly good choice for long-running requests and why making users wait for them to complete is a bad thing. He then walks you through the setup of a basic Laravel application that includes the maatwebsite/excel library for the Excel handling. He shows his configuration setup, on both the Nginx and Laravel sides, to handle serving up the app. He uses Laravel migrations to set up the database, and models, routing and logic (a controller) to handle the incoming Excel file for import. With this in place he then creates the console command that processes the file and saves the information it contains to the database. Finally, he ends the post with the cron configuration you'll need to handle the import, running it nightly at midnight.
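To give a rough feel for the approach without reproducing the post's own code, here's a minimal sketch of what such a console command could look like. The class, model, and column names (ImportSpreadsheet, Product, and so on) are illustrative assumptions rather than the article's actual identifiers, and the import class uses the maatwebsite/excel 3.x ToModel API; older releases of that package use a different Excel::load() style instead.

```php
<?php

namespace App\Console\Commands;

use App\Models\Product;               // hypothetical Eloquent model
use Illuminate\Console\Command;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Facades\Excel;

// Hypothetical import class (maatwebsite/excel 3.x): maps each spreadsheet
// row to an Eloquent model, which the package persists for us.
// (Both classes are shown in one file for brevity; in a real app each
// would live in its own file.)
class ProductsImport implements ToModel
{
    public function model(array $row)
    {
        return new Product([
            'name'  => $row[0],
            'price' => $row[1],
        ]);
    }
}

// Console command intended to be run from cron, outside the HTTP request
// cycle, so users never wait on the long-running import.
class ImportSpreadsheet extends Command
{
    protected $signature   = 'import:spreadsheet {path : Path to the Excel file}';
    protected $description = 'Import an Excel spreadsheet into the database';

    public function handle()
    {
        Excel::import(new ProductsImport, $this->argument('path'));

        $this->info('Import finished.');
    }
}
```

Running it nightly at midnight then comes down to a crontab entry along the lines of `0 0 * * * php /path/to/artisan import:spreadsheet /path/to/file.xlsx`, with the exact paths depending on your deployment; the post itself walks through the real configuration it uses.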