How can I speed up multiple asynchronous requests to a single fetch endpoint?

The code I will post works properly, but the one problem I've run into is that it isn't optimized for fetching a lot of chunks of data; it tends to overload/lag. What kind of fetching should I use that is optimized for fetching a lot of data? I am clueless now. Thanks for the help, guys! 🙂 Note: Backend: PHP 5.6, Database: MongoDB 5.6
24 Replies
rig26.
rig26.•16mo ago
<table id="tbl-customers">
  <thead>
    <!-- Remove the Pagination Buttons in Header ('Modified By: Ron Ivin Gregorio - 02/20/2023') -->
    <tr>
      <th>No.</th>
      <th id="sort-registration_date" class="sort" title="click to sort by registration date" style='max-width:90px'>Registration Date</th>
      <th id="sort-name" class="sort" title="click to sort by student name">Name</th>
      <th id="sort-dlnum" class="sort" title="click to sort by driver's license number">Driver's License</th>
      <th>Student ID</th>
      <th>Course</th>
      <th>Class</th>
      <th>Affiliate</th>
      <th>Company</th>
      <th>Status</th>
    </tr>
  </thead>
  <tbody class='customers-list'>
    <!-- Remove the parsing of data directly in tbody ('Modified By: Ron Ivin Gregorio - 02/20/2023') -->
  </tbody>
</table>
This is the table where I want to append all the data fetched from the server
rig26.
rig26.•16mo ago
Fetching of Data - Pastebin.com
rig26.
rig26.•16mo ago
This is how I fetch data from the database:
<?php
// error_reporting(E_ALL);
// ini_set('display_errors', '1');
$root = '../../';

// security check
include_once $root . '_security/security_check.php';

use Customers\Customers;
use User\User;

// Code for getting the list of students and student number ('Added By: Ron Ivin Gregorio - 02/20/2023')
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
    require_once $root . 'classes/Customers.php';
    require_once $root . 'classes/User.php';

    $custmgt = new Customers;
    $user    = new User($custmgt->db);
    $post    = json_decode(file_get_contents('php://input'), true);

    $sort    = $post['sort'];
    $order   = $post['order'];
    $search  = $post['search'];
    $asearch = $post['advanced_search'];
    $skip    = $post['skip'];

    $customers       = $custmgt->getCustomers($user, $skip, $search, $asearch, $sort, $order);
    $data            = $customers['customers'];
    $customer_length = $customers['customer_length'];

    echo json_encode(['status' => true, 'data' => $data, 'c_length' => $customer_length]);
}
This is the endpoint that fetches the data from the database.
rig26.
rig26.•16mo ago
https://pastebin.com/7VQRAyzq This is the Customers class, with the function and query I am trying to fetch with
rig26.
rig26.•16mo ago
hello @jochemm, can you take a look at it? I can explain it further if needed
Jochem
Jochem•16mo ago
where does it hang? I don't really have the time to look at it in detail, but I don't necessarily see anything wrong?
rig26.
rig26.•16mo ago
yes, there's nothing wrong with it as of now, but the problem is it tends to lag the server; it's not efficient yet. I don't know what I should google (the proper question, I mean). If you take a peek at my JS, I use a do/while loop to fetch the same link multiple times. I don't know whether that's well optimized or not
Jochem
Jochem•16mo ago
you're awaiting the fetch, so it won't run more than one at a time. If it's lagging the server, it might be because of the backend and not the frontend
rig26.
rig26.•16mo ago
Do you know how I can put a delay on it?
Jochem
Jochem•16mo ago
await new Promise(r => setTimeout(r, 2000));
that will wait for 2 seconds (2000 ms) and then continue to the next line. That's probably the quickest, easiest way to just introduce a delay. But again, if your server is getting overloaded, this code running in one browser isn't going to be the issue. If there are a lot of rows in that customer table, fetching rows further and further down is going to take longer and longer, depending on how the table's been indexed, how many joins there are in the query...
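The delay slots into the paginated do/while loop like this. This is a minimal sketch, not the thread's actual code: `fetchPage` is a hypothetical stand-in for the real `fetch()` POST to the PHP endpoint (it fakes the server so the snippet is self-contained), and the page size of 20 matches the skip steps described in this thread.

```javascript
const PAGE = 20;

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

// Stand-in for the real request, roughly:
//   fetch('customers.php', { method: 'POST', body: JSON.stringify({ skip, ... }) })
// Here it just slices a fake dataset of `total` rows.
async function fetchPage(skip, total) {
  const n = Math.max(0, Math.min(PAGE, total - skip));
  return { data: Array.from({ length: n }, (_, i) => `customer ${skip + i}`) };
}

async function fetchAll(total, delayMs = 0) {
  const rows = [];
  let skip = 0;
  let page;
  do {
    // each await finishes before the next request starts, so only
    // one request is in flight at a time
    page = await fetchPage(skip, total);
    rows.push(...page.data);
    skip += PAGE;
    // throttle between requests; skip the delay once the last page arrives
    if (page.data.length === PAGE && delayMs > 0) await sleep(delayMs);
  } while (page.data.length === PAGE);
  return rows;
}
```

A short page (fewer than 20 rows) signals the end of the data, which is how the loop terminates without knowing the total up front.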
rig26.
rig26.•16mo ago
There are a lot of joins and queries in the backend (not written by me); I'm just continuing it because the first programmer doesn't want me to alter it. So doing that kind of fetch doesn't affect the code, right?
Jochem
Jochem•16mo ago
not sure what you mean by that?
rig26.
rig26.•16mo ago
as you can see in how I fetch it, I used a do/while loop; take a look at this
Jochem
Jochem•16mo ago
okay, but what do you mean by "affect the code"?
rig26.
rig26.•16mo ago
I did it because I want to get the customer length (I mean the running time to load the customers). I am iterating on a single link only, trying to get the customer length first
Jochem
Jochem•16mo ago
ah, the while loop doesn't change how long each fetch takes, no. As for getting the total customer count, this is a really inefficient way of getting it. The server can do that much more quickly if it has an endpoint for it, or if it just includes that count in each returned request
rig26.
rig26.•16mo ago
each fetch only gets 20 customers, then skips ahead in the query: first it's skip(20), then skip(40), and so forth. The counter I use is the while loop
Jochem
Jochem•16mo ago
if you can't change the backend, then you're stuck doing it this way, but it's a problem that's much better solved on the server
rig26.
rig26.•16mo ago
I can change some of the backend, but not totally alter the criteria for getting the data. So what can you suggest for getting the customer count? MongoDB is my database, PHP is the backend
Jochem
Jochem•16mo ago
doing it on the server is the correct solution. If you want to optimize this anyway, and you're discarding the fetched customer data regardless, you could increase the skip a lot, and then backtrack in large steps until you find the correct last page...

Say you have 1400 records and you need to count them. The first fetch for display data gives you 20 records, so you know there are at least 20. The second fetch sets skip to 1000. You still get 20, so you know there are at least 1000 records. Third, you set it to 2000 (or 10,000; how big these steps are should be determined by the size of your dataset), but you get 0 results, so you know there are fewer than 2000.

You can then take half the difference and check at 1500: still 0. So you halve the difference with the last skip that returned results, and you get 20 back at skip 1250. Then you go up again, to halfway between 1250 and 1500, so at 1375 you still get 20 results. 1500 - 1375 is 125, so you check at 1437 and get 0.

You can keep halving the distance you check until you converge. If you tune it correctly, you could go from 70 fetches to find 1400 rows to maybe half a dozen? But that's definitely a hack, and the answer should really be to ask the backend dev to provide you with that count, either in each query or from a separate endpoint
rig26.
rig26.•16mo ago
So you're basically telling me not to get the customer length directly, but rather to guess it using an algorithm or something?
Jochem
Jochem•16mo ago
"guess" is a big word; you just use a different search algorithm to get to the correct answer. You'll have to make a sane guess for the average number of rows returned. You pretty much double the difference between your first guess and the next every time you find rows, and halve the difference every time you don't, and you can zero in on the exact number pretty quickly. You still end up with the exact correct number in the end, but instead of downloading all your customers every time with lots of calls, you try to limit the number of calls and only take samples. Don't get me wrong though, it's still a bad solution to the problem; it'll just be a quicker bad solution than the naive linear approach
rig26.
rig26.•16mo ago
Yeah, thanks for the idea. Hello Jochem, do you know if async can tend to lag the DOM when appending data to the table?
Jochem
Jochem•16mo ago
it shouldn't
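What can lag the page is appending rows to the table one at a time inside the fetch loop, since each insert can trigger a reflow. A minimal sketch of batching a whole fetched page into one DOM update instead; the `name`/`course`/`status` fields are assumptions for illustration, not the thread's actual schema:

```javascript
// Build the markup for one fetched page of customers as a single string,
// so it can be inserted into the tbody in one DOM operation.
// Note: real code should HTML-escape the field values.
function buildRows(customers, startIndex = 0) {
  return customers
    .map((c, i) =>
      `<tr><td>${startIndex + i + 1}</td><td>${c.name}</td><td>${c.course}</td><td>${c.status}</td></tr>`)
    .join("");
}

// Usage in the browser, once per fetched page rather than once per row:
// document.querySelector(".customers-list")
//   .insertAdjacentHTML("beforeend", buildRows(page.data, skip));
```

Passing `skip` as `startIndex` keeps the "No." column continuous across pages.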