July 9, 2024 · 4 min read
Imagine you're tasked with importing data from a school's online application system. As students submit their applications, the data is stored in an Excel file that needs to be imported into your Laravel application. In this article, we'll walk you through efficiently handling large Excel files using Filament and Laravel.
For this project, we will use the spatie/simple-excel package to read the data from our Excel file and process it in the backend. The package also handles CSV files. Make sure to install it with: composer require spatie/simple-excel. It offers many helpful features for complex tasks, but we'll focus on a simpler example in this article.
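If you want a quick taste of the reader API before we wire it into Filament, here is a minimal sketch; the file path and column names are placeholders, not from the project:

use Spatie\SimpleExcel\SimpleExcelReader;

// Rows are streamed one at a time as associative arrays keyed by the
// header row, so even very large files never have to fit in memory.
SimpleExcelReader::create(storage_path('app/imports/students.xlsx'))
    ->getRows()
    ->each(function (array $row) {
        // e.g. $row['name'], $row['phone'], $row['address']
    });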
To begin the data processing task, we'll add an action button to our Filament ApplicationResource page. This button, Import Applications, will handle the import process. For simplicity, we have removed the rest of the code from the Resource page, such as the form and table definitions, and placed the button in the table header through headerActions.
<?php

namespace App\Filament\Resources;

use App\Jobs\ApplicationsImportsJob;
use App\Models\Application;
use Filament\Forms\Components\FileUpload;
use Filament\Forms\Form;
use Filament\Resources\Resource;
use Filament\Tables;
use Filament\Tables\Table;

class ApplicationResource extends Resource
{
    protected static ?string $model = Application::class;

    public static function form(Form $form): Form
    {
        // Form schema omitted for brevity
        return $form;
    }

    public static function table(Table $table): Table
    {
        return $table
            ->columns([])
            ->filters([])
            ->actions([])
            ->headerActions([
                Tables\Actions\Action::make('Import Applications')
                    ->form([
                        FileUpload::make('xlsxFile')
                            ->label('XLSX File')
                            ->disk('local')
                            ->required()
                            ->directory('imports'),
                    ])
                    ->action(function (array $data) {
                        if ($data['xlsxFile'] !== null) {
                            // Path relative to the local disk, e.g. imports/file.xlsx
                            $filePath = $data['xlsxFile'];

                            ApplicationsImportsJob::dispatch($filePath);
                        }
                    }),
            ]);
    }
}
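The code above references an Application model that the article doesn't show. For context, a minimal migration sketch for its table might look like the following; the exact schema is our assumption, based only on the columns the import jobs touch:

<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('applications', function (Blueprint $table) {
            $table->id();
            // Columns matching the spreadsheet headers read by the import jobs
            $table->string('name');
            $table->string('phone');
            $table->string('address');
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('applications');
    }
};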
Next, we need to create a job to handle the processing of the imported data. In our case, this job is named ApplicationsImportsJob (scaffolded with php artisan make:job ApplicationsImportsJob). The job will read the Excel file, chunk the data into manageable pieces, and dispatch a new job for each chunk. We've set the chunk size to 500 rows, which has worked well in our project, but feel free to experiment with it.
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Spatie\SimpleExcel\SimpleExcelReader;

class ApplicationsImportsJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public $filename)
    {
        //
    }

    public function handle(): void
    {
        // getRows() returns a LazyCollection, so the file is streamed
        // rather than loaded into memory all at once.
        SimpleExcelReader::create(storage_path('app/' . $this->filename))
            ->getRows()
            ->chunk(500)
            ->each(fn ($chunk) => ApplicationsImportChunkJob::dispatch($chunk, $this->filename));
    }
}
Given the large amount of data in the Excel file, it's important to break it into manageable chunks for efficient processing. The chunk job iterates over the up-to-500 rows in its chunk and creates a new application for each student, saving the name, phone, and address found under the matching column names in the Excel header row. The job will be attempted 5 times before it's marked as failed.
<?php

namespace App\Jobs;

use App\Models\Application;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ApplicationsImportChunkJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * The number of times the job may be attempted.
     */
    public $tries = 5;

    /**
     * Create a new job instance.
     */
    public function __construct(public $chunk, public $file)
    {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        foreach ($this->chunk as $row) {
            // updateOrInsert() only inserts when no identical row exists, which
            // keeps retried attempts of this job from creating duplicate rows.
            // Note it runs through the query builder, so Eloquent timestamps
            // and model events are skipped.
            Application::updateOrInsert([
                'name' => $row['name'],
                'phone' => $row['phone'],
                'address' => $row['address'],
            ]);
        }
    }
}
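Because the job gives up after 5 attempts, it's worth knowing when that happens. One option, shown here as a sketch rather than part of the original project, is to add a failed() method to ApplicationsImportChunkJob; Laravel calls it once all attempts are exhausted:

use Illuminate\Support\Facades\Log;
use Throwable;

// Inside App\Jobs\ApplicationsImportChunkJob:
public function failed(Throwable $exception): void
{
    // Record which file produced the failing chunk so it can be re-imported later.
    Log::error('Application chunk import failed', [
        'file' => $this->file,
        'error' => $exception->getMessage(),
    ]);
}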
After setting up the Laravel jobs for processing large Excel files, the next step is running them. Dispatching a job simply pushes it onto a queue; Laravel then handles it in the background without slowing down the main application.
In Laravel, queues are used to handle time-consuming tasks asynchronously. When a job is dispatched, it is added to a queue where it waits to be processed by a queue worker. Read the article Laravel 11 for Beginners: Using Queues if you want to learn how to set up queues locally. To process the jobs in your queue, run the queue worker: php artisan queue:work.
On Laravel Forge, navigate to the Queues section and change only the Maximum Seconds Per Job to 300, which is more than enough time to wait for a job to finish. Clicking Create will launch your queue worker.
It's essential to configure the queue correctly to ensure that your jobs will run. In your project's .env file, set QUEUE_CONNECTION to redis to make use of its fast, in-memory data store. Here's a snippet for clarity:
QUEUE_CONNECTION=redis
REDIS_DB=0
If you're running multiple sites on the same server, assign each one a distinct Redis database to prevent any overlap. Simply increment the REDIS_DB number accordingly (e.g., 0, 1, 2, etc.), and do the same for REDIS_CACHE_DB if you also use Redis for caching.
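For example, two sites on the same server could be kept apart like this (the site names are hypothetical):

# site-one/.env
QUEUE_CONNECTION=redis
REDIS_DB=0

# site-two/.env
QUEUE_CONNECTION=redis
REDIS_DB=1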
By following these steps and using Laravel and the Spatie\SimpleExcel package, you can process large Excel files on your Filament application efficiently. Breaking the data into manageable chunks and handling them with dedicated jobs ensures a smooth and effective import process.
If you need help with a Laravel project, let's get in touch.