Array filtering performance
(version: 0)
Comparing performance of:
Array.filter vs For loop
Created:
3 years ago
by:
Registered User
Script Preparation code:
var arr = [];
for (var i = 0; i < 1_000_000; i++) {
  arr.push({ id: i });
}
Tests:
Array.filter
var result = arr.filter(x => x.id % 2 === 0)
For loop
var result = [];
var l = arr.length;
for (var i = 0; i < l; i++) {
  var value = arr[i];
  if (value.id % 2 === 0) {
    result.push(value);
  }
}
Autogenerated LLM Summary (model: llama3.2:3b, generated one year ago):
Let's break down the benchmark and explain what's being tested.

**Benchmark Overview**

The benchmark measures the performance of two approaches to filtering an array of one million objects: the `Array.filter()` method and a traditional `for` loop.

**Approach 1: Array.filter()**

This approach uses `Array.filter()`, which returns a new array containing every element for which the provided callback returns a truthy value. Here the callback keeps elements whose `id` property is even (`x.id % 2 === 0`).

Pros:

* Concise, expressive, declarative syntax
* No manual index or result-array bookkeeping

Cons:

* The callback is invoked once per element (the method still visits every element, not just those that pass the test), which adds per-iteration function-call overhead that engines may or may not optimize away

**Approach 2: For loop**

This approach iterates over each element with a traditional `for` loop and pushes it onto a result array when the condition (`value.id % 2 === 0`) is met.

Pros:

* No per-element callback invocation, so often slightly faster in practice
* More control over the iteration process (early exit, batching, etc.)

Cons:

* More verbose and error-prone than `Array.filter()`

Note that both approaches allocate a new result array of the same size, so their memory overhead is essentially the same.

**Other Considerations**

* **Cache locality**: With one million heap-allocated objects, both approaches chase a pointer from the array slot to each object, so memory access patterns, rather than the choice of loop construct, may dominate the runtime.
* **Type coercion**: In this benchmark every `id` is a number, so `x.id % 2 === 0` involves no coercion. If `id` could be a string or another type, the `%` operator would first coerce it to a number, which could affect both correctness and performance.
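The two approaches can be compared locally with a standalone Node.js sketch (not part of the benchmark itself; the `console.time` labels and variable names are illustrative, and the timings will vary by engine and machine):

```javascript
// Build the same 1,000,000-element input the benchmark's preparation code uses
var arr = [];
for (var i = 0; i < 1_000_000; i++) {
  arr.push({ id: i });
}

// Approach 1: Array.filter()
console.time('filter');
var filtered = arr.filter(x => x.id % 2 === 0);
console.timeEnd('filter');

// Approach 2: traditional for loop
console.time('loop');
var result = [];
var l = arr.length;
for (var j = 0; j < l; j++) {
  var value = arr[j];
  if (value.id % 2 === 0) {
    result.push(value);
  }
}
console.timeEnd('loop');

// Both approaches keep exactly the even-id half of the input
console.log(filtered.length, result.length); // 500000 500000
```

Whichever approach times faster, the two result arrays contain the same elements in the same order, so the benchmark is a fair apples-to-apples comparison.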
**Library/Additional Features**

No external libraries are used; the benchmark relies only on standard JavaScript and built-in array methods.

**Special JS Feature/Syntax**

No special JavaScript features or syntax are used beyond standard constructs (the numeric separator in `1_000_000` requires a reasonably modern engine).

**Alternatives**

Other options for working with large arrays include:

* `Array.prototype.some()`, which returns `true` as soon as one element passes the test. It answers an existence question rather than producing a filtered array, so it is an alternative only when you do not need the matching elements themselves.
* Web Workers, to move the filtering off the main thread across CPU cores (though the cost of copying one million objects to a worker can outweigh the gain).
* A utility library such as Lodash or Underscore.js, which provides its own array filtering helpers.

The choice of approach depends on the specific use case and requirements. `Array.filter()` is usually a good default for simple filtering tasks thanks to its concise syntax; a hand-written loop can be preferable when you need maximum control over the iteration or the lowest possible per-element overhead.
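To make the `some()` distinction concrete, here is a small sketch (illustrative names, not part of the benchmark) showing that `some()` short-circuits at the first match instead of building a filtered array:

```javascript
// Small sample input in the same shape as the benchmark's objects
var arr = [];
for (var i = 0; i < 10; i++) {
  arr.push({ id: i });
}

// some() stops as soon as one element passes the test and returns a boolean,
// so it suits existence checks, not filtering
var hasEven = arr.some(x => x.id % 2 === 0);
console.log(hasEven); // true

// filter() always visits every element and returns the matching elements
var evens = arr.filter(x => x.id % 2 === 0);
console.log(evens.length); // 5
```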
Related benchmarks:
Array .push() vs .unshift(), 1M elements
array last element big data
Some vs. Filter vs. Find, 10 vs 10k
Array.push(x) vs array[n]=x