MeasureThat.net
Array processing benchmark (version: 0)
Quick benchmark comparing the performance of:
reduce with new array vs reduce with array mutation vs reduce with array concat vs for vs for-of vs map vs forEach vs values (iterates over for-of)
Created: 3 years ago by Guest
Script Preparation code:
var arr = Array(10_000).fill(0)
Tests:
reduce with new array
arr.reduce((acc, x) => [...acc, x], [])
reduce with array mutation
arr.reduce((acc, x) => { acc.push(x); return acc; }, [])
reduce with array concat
arr.reduce((acc, x) => acc.concat(x), [])
for
const newArray = []; for (let i = 0; i < arr.length; i++) { newArray.push(arr[i]); }
for-of
const newArray = []; for (let x of arr) { newArray.push(x); }
map
arr.map(x => x);
forEach
const newArray = []; arr.forEach(x => newArray.push(x));
values (iterates over for-of)
const newArray = []; for (let x of arr.values()) { newArray.push(x); }
Latest run results:
No previous run results
This benchmark does not have any results yet. Be the first one to run it!
Related benchmarks:
Array .push() vs .unshift(), 100K elements
`Array.slice(-1)[0]` vs `Array[Array.length]` for 10000 length
array last element big data
arr.at(-1) vs arr[arr.length - 1]
shallow copy of 6M elements array