Import CSV file in a Laravel controller and insert data into two tables

There are three steps to read a CSV file and import it into the database in Laravel:

  1. Read the CSV file
  2. Convert it to an array
  3. Finally, create records in our database

Before we start, I have created a sample test.csv file and put it in my public folder, under a file folder:

name,email,password
user1,[email protected],pasxxxxxxxxxword
user2,[email protected],pasxxxxxxxxxword
user3,[email protected],pasxxxxxxxxxword

Steps 1 and 2: I created a helper function called csvToArray; I just put it in my controller for now (this function is inspired by this link). It simply reads the CSV file and converts it to an array:

function csvToArray($filename = '', $delimiter = ',')
{
    // Bail out if the file is missing or unreadable
    if (!file_exists($filename) || !is_readable($filename))
        return false;

    $header = null;
    $data = array();
    if (($handle = fopen($filename, 'r')) !== false)
    {
        while (($row = fgetcsv($handle, 1000, $delimiter)) !== false)
        {
            // Use the first row as the header, then key every
            // following row by those column names
            if (!$header)
                $header = $row;
            else
                $data[] = array_combine($header, $row);
        }
        fclose($handle);
    }

    return $data;
}
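For the sample test.csv above, the helper returns one associative array per data row, keyed by the header columns, roughly like this:

[
    ['name' => 'user1', 'email' => '[email protected]', 'password' => 'pasxxxxxxxxxword'],
    ['name' => 'user2', 'email' => '[email protected]', 'password' => 'pasxxxxxxxxxword'],
    ['name' => 'user3', 'email' => '[email protected]', 'password' => 'pasxxxxxxxxxword'],
]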

Step 3: here is the final step, reading the array and inserting it into our database:

public function importCsv()
{
    // Requires your User model, e.g. use App\User;
    $file = public_path('file/test.csv');

    $customerArr = $this->csvToArray($file);

    for ($i = 0; $i < count($customerArr); $i++)
    {
        // Creates the user unless an identical record already exists
        User::firstOrCreate($customerArr[$i]);
    }

    return 'Job done or whatever';
}

Note: this solution assumes that you have a User model in your Laravel project and the corresponding table in your database.
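If you want to try it quickly, you can point a route at the method; the controller name below is just an assumption for illustration:

// routes/web.php (CustomerController is a hypothetical name;
// on Laravel 8+ use [CustomerController::class, 'importCsv'] instead)
Route::get('/import-csv', 'CustomerController@importCsv');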

If you dump the result with dd($customerArr), you will see one entry per CSV row, keyed by the header columns, just like the example array shown earlier.


@maytham The solution by Maytham will work well. However, it will have a huge problem if you try it with large data. Even 1000 rows will create a problem, as it will issue 1000 separate insert statements. I will just edit his third step and add my own input:

public function importCsv()
{
    // Requires the DB facade: use Illuminate\Support\Facades\DB;
    $file = public_path('file/test.csv');

    $customerArr = $this->csvToArray($file);
    $data = [];
    for ($i = 0; $i < count($customerArr); $i++)
    {
        // Map each CSV row to your table's columns
        $data[] = [
            'column_name1' => $customerArr[$i]['csv_header1'],
            'column_name2' => $customerArr[$i]['csv_header2'],
            // ...and so on for each column
        ];
        //User::firstOrCreate($customerArr[$i]);
    }
    DB::table('table_name')->insert($data);
    return 'Job done or whatever';
}

This will call the database once to insert as many rows as you wish, be it 1,000, 100,000 or whatever. However, if you have a huge CSV this will still be a problem, since you will need to insert in chunks. In PostgreSQL, for instance, the cap is 65,535 bound parameters per statement, so the number of rows you can insert at once depends on your column count. Every database has a limit like this, and you need to look it up for yours.
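A minimal sketch of chunked inserts, assuming the same $data array as above and an arbitrary chunk size of 500:

// Insert 500 rows per statement to stay well under the
// database's per-statement parameter limit
foreach (array_chunk($data, 500) as $chunk) {
    DB::table('table_name')->insert($chunk);
}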


In your store() method, create the record in your lists table, and then iterate over the contents of the CSV file and insert the data into the customers table. You should create a relation between customers and lists for this purpose. You would also be better off using something like the PHP League’s CSV package for reading such files:

public function store(AddCustomersRequest $request)
{
    // Get the uploaded CSV file
    $file = $request->file('csv');

    // Create the list name
    $name = time().'-'.$file->getClientOriginalName();

    // Create a list record in the database.
    // Note: `List` is a reserved word in PHP and cannot be used as
    // a class name, so the model needs another name, e.g. CustomerList.
    $list = CustomerList::create(['name' => $name]);

    // Create a CSV reader instance
    $reader = Reader::createFromFileObject($file->openFile());

    // With league/csv 9+, treat the first row as the header so
    // each record comes back keyed by column name
    $reader->setHeaderOffset(0);

    // Create a customer from each row in the CSV file
    foreach ($reader->getRecords() as $row) {
        $list->customers()->create($row);
    }

    // Redirect back to where you need with a success message
    return redirect()->back()->with('success', 'Customers imported.');
}
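The customers() relation used above assumes a one-to-many setup roughly like this (the CustomerList and Customer model names are assumptions, since List itself is reserved in PHP):

use Illuminate\Database\Eloquent\Model;

class CustomerList extends Model
{
    protected $fillable = ['name'];

    // A list has many customers; by convention Laravel looks for
    // a customer_list_id foreign key on the customers table
    public function customers()
    {
        return $this->hasMany(Customer::class);
    }
}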