How to properly run a query that needs a lot of processing without getting an error

I am working on an online e-learning website with Laravel 5.8, and I need to run a query that updates the exam results of users who participated in an exam.

Here is the Controller method for updating exam scores:

public function compute($oleId)
{
        try {
            ini_set('memory_limit', '-1');
            set_time_limit(0);

            DB::beginTransaction();

            /* Get the exam; firstOrFail() throws if no exam matches, so the catch block handles a bad $oleId */
            $exam = OlympiadExam::query()->where('ole_id', $oleId)->firstOrFail();

            /* Calculate the score of Exam */
            if ($exam->ole_is_main == '0') {
                foreach ($exam->olympiadExamExecution as $execution) {
                    // load() eager-loads the relations and returns the execution model itself
                    $questions = $execution->load('olympiadExamExecutionQuestion.olympiadExamQuestion');
                    $all = $exam->ole_question_count;
                    $notAnswered = $questions->olympiadExamExecutionQuestionNotAnswered->where('oee_oex_id', $execution->oex_id)->count();
                    $answered = $all - $notAnswered;
                    $truthy = $questions->olympiadExamExecutionQuestionTruthy->where('oee_oex_id', $execution->oex_id)->count();
                    $falsy = $all - ($truthy + $notAnswered);

                    $score = (float)($truthy - ($falsy / 3));
                    $percentage = (float)((($truthy * 3) - $falsy) / ($all * 3)) * 100;

                    $prePositive = (float)$percentage * ($exam->ole_percent_effect / 100);
                    $percentPositive = ($prePositive > 0) ? $prePositive : 0;
                    $percentFinal = (($percentage + $percentPositive) > 100) ? 100 : ($percentage + $percentPositive);
                    $scoreFinal = ((($percentFinal * $all) / 100) > $all) ? $all : ($percentFinal * ($all / 100));

                    $examResult = [
                        'oex_correct_answer_count' => $truthy,
                        'oex_wrong_answer_count' => $falsy,
                        'oex_no_answer_count' => $notAnswered,
                        'oex_score' => $score,
                        'oex_percent' => $percentage,
                        'oex_percent_positive' => $percentPositive,
                        'oex_percent_final' => $percentFinal,
                        'oex_score_final' => $scoreFinal,
                    ];

                    OlympiadExamExecution::query()->where('oex_id', $execution->oex_id)->update($examResult);
                }
            }

            $candidates = OlympiadExamExecution::query()->where('oex_ole_id', $oleId)->pluck('oex_percent_final')->toArray();
            if (!empty($candidates)) {
                $this->computeRank($candidates);
            }
            DB::commit();
            session()->flash('computeStatus', 'Process for calculating score completed');
            return redirect()->back();
        } catch (\Throwable $exception) {
            // catch \Throwable so PHP 7 Errors roll the transaction back as well
            DB::rollBack();
            session()->flash('computeStatus', 'Process for calculating score went wrong');
            return redirect()->back();
        }

    }

Now, this method works fine for a few users who participated in the exam, but it fails for a large number of users (about 500).

Therefore I tried setting ini_set('memory_limit', '-1'); and set_time_limit(0); before the query runs, but it still fails and shows this message:

(screenshot of the error message)

So I wonder, what is going wrong to cause this error?

How can I make this processing work properly for a large number of users?

I would REALLY appreciate any idea or suggestion from you guys because my life depends on this...


Laravel has a chunk method for working with large data sets. You can chunk the data and process it in batches. Here is the link for reference: here

I hope this link helps. Here is what the documentation says about it:

If you need to work with thousands of database records, consider using the chunk method provided by the DB facade. This method retrieves a small chunk of results at a time and feeds each chunk into a closure for processing. For example, let's retrieve the entire users table in chunks of 100 records at a time:
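Here is the docs example, plus a sketch of how the same idea might apply to the query in the question. The batch size of 100 is arbitrary, and the comment inside the closure stands in for the score computation shown in the controller above; treat this as a sketch rather than a drop-in replacement:

```php
// The example from the Laravel docs: walk the users table 100 rows at a time,
// so only one chunk of models is held in memory at any moment.
DB::table('users')->orderBy('id')->chunk(100, function ($users) {
    foreach ($users as $user) {
        // process each row here
    }
});

// A sketch applied to the exam executions from the question. chunkById()
// paginates on the given key column ('oex_id' here), which is safer than
// chunk() when rows are being updated while you iterate.
OlympiadExamExecution::query()
    ->where('oex_ole_id', $oleId)
    ->chunkById(100, function ($executions) {
        foreach ($executions as $execution) {
            // compute $examResult for this $execution (as in the controller)
            // and update it, e.g.:
            // $execution->update($examResult);
        }
    }, 'oex_id');
```

Using chunkById (rather than plain chunk) matters here because the loop updates the very rows it iterates over; paginating by the primary key keeps the chunks stable across updates. For very long runs, you may also want to move this work into a queued job instead of a web request, so the HTTP request does not hit the server's execution time limit.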