Comparing performance of: forEach + find + push vs Reduce + find vs map + find + filter
Created: 2 years ago by: Registered User
Tests:
forEach + find + push
const responseData = new Array(100).fill(null).map((_, index) => ({
  contract: `contract_${index}`,
  token_id: index,
  hash: `hash_${index}`,
}));
const cachedData = new Array(100).fill(null).map((_, index) => ({
  contract: `contract_${index % 2}`,
  token_id: index,
  hash: `hash_${index}`,
  nft: index % 2 === 0 ? { data: `NFT data for ${index}` } : null,
}));
const readyForRender = [];
const toUpdate = [];
// Single pass over responseData; each item is pushed into one of two buckets.
responseData.forEach(item => {
  const actualCachedData = cachedData.find(
    cache =>
      cache.contract === item.contract &&
      cache.token_id === item.token_id &&
      cache.hash === item.hash
  );
  if (actualCachedData && actualCachedData.nft) {
    readyForRender.push(actualCachedData);
  } else {
    toUpdate.push(item);
  }
});
Reduce + find
const responseData = new Array(100).fill(null).map((_, index) => ({
  contract: `contract_${index}`,
  token_id: index,
  hash: `hash_${index}`,
}));
const cachedData = new Array(100).fill(null).map((_, index) => ({
  contract: `contract_${index % 2}`,
  token_id: index,
  hash: `hash_${index}`,
  nft: index % 2 === 0 ? { data: `NFT data for ${index}` } : null,
}));
// Single pass via reduce; note that spreading the accumulator arrays on
// every iteration copies both arrays each time.
const { readyForRender, toUpdate } = responseData.reduce(
  (accumulator, item) => {
    const actualCachedData = cachedData.find(
      cache =>
        cache.contract === item.contract &&
        cache.token_id === item.token_id &&
        cache.hash === item.hash
    );
    if (actualCachedData && actualCachedData.nft) {
      return {
        ...accumulator,
        readyForRender: [...accumulator.readyForRender, actualCachedData],
      };
    } else {
      return {
        ...accumulator,
        toUpdate: [...accumulator.toUpdate, item],
      };
    }
  },
  { readyForRender: [], toUpdate: [] }
);
map + find + filter
const responseData = new Array(100).fill(null).map((_, index) => ({
  contract: `contract_${index}`,
  token_id: index,
  hash: `hash_${index}`,
}));
const cachedData = new Array(100).fill(null).map((_, index) => ({
  contract: `contract_${index % 2}`,
  token_id: index,
  hash: `hash_${index}`,
  nft: index % 2 === 0 ? { data: `NFT data for ${index}` } : null,
}));
// Map each response item to the index of its loaded cached match (or null),
// then drop the nulls and derive both result arrays from the indices.
const loadedIndices = responseData
  .map((item, index) => {
    const actualCachedData = cachedData.find(
      cache =>
        cache.contract === item.contract &&
        cache.token_id === item.token_id &&
        cache.hash === item.hash
    );
    return actualCachedData && actualCachedData.nft ? index : null;
  })
  .filter(index => index !== null);
const readyForRender = loadedIndices.map(index => cachedData[index]);
const toUpdate = responseData.filter((_, index) => !loadedIndices.includes(index));
Latest run results:
Run details: (Test run date: 2 years ago)
User agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36
Browser/OS: Chrome 122 on Mac OS X 10.15.7
Test name               Executions per second
forEach + find + push   33438.6 Ops/sec
Reduce + find           33113.2 Ops/sec
map + find + filter     41510.7 Ops/sec
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
The provided benchmark measures the performance of three different approaches for partitioning data in JavaScript:

1. **`forEach` + `find` + `push`**: iterates over `responseData` once with `forEach`, looks up the matching cached entry with `find`, and pushes each item into either `readyForRender` or `toUpdate`.

2. **`reduce` + `find`**: folds `responseData` into an accumulator object, looking up matches with `find` and returning a fresh accumulator (with spread copies of both result arrays) on every iteration.

3. **`map` + `find` + `filter`**: maps each response item to the index of its loaded cached match (or `null`), filters out the `null`s, then derives `readyForRender` from those indices and `toUpdate` by filtering `responseData`.

Pros and cons of each approach:

**1. `forEach` + `find` + `push`**:
Pros:
* Single pass over the data with minimal allocation; pushing into pre-declared arrays avoids intermediate copies.
* Straightforward, imperative control flow.
Cons:
* Relies on mutating external arrays, which some codebases prefer to avoid.

**2. `reduce` + `find`**:
Pros:
* Single pass that produces both result arrays in one expression, without external mutable state.
Cons:
* Spreading the accumulator and its arrays on every iteration copies them repeatedly, adding quadratic allocation overhead as the results grow.
* The nested reduction is harder to read than a plain loop.

**3. `map` + `find` + `filter`**:
Pros:
* Declarative; each step is a small, composable transformation.
Cons:
* Makes several passes and builds intermediate arrays, and the final `toUpdate` step calls `includes` per item, another linear scan.

Other considerations:
* All three variants call `cachedData.find(...)` once per response item, an O(n) scan, so each approach is O(n²) overall; the lookup cost, not the iteration style, dominates.
* In this run the `map` + `find` + `filter` variant was nonetheless fastest (41510.7 Ops/sec), while the `reduce` variant's accumulator copying made it slowest (33113.2 Ops/sec).

Alternatives:
* Index `cachedData` once (for example in a `Map` keyed by the match fields) to replace the repeated linear `find` scans with constant-time lookups.
* Utility libraries such as Lodash or Ramda provide helpers like `partition` for this split-into-two-arrays pattern.
* For very large datasets, Web Workers can move the processing off the main thread.

In conclusion, the choice of approach depends on your specific requirements, such as readability, performance, and maintainability.
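Every variant in the benchmark repeats a linear `cachedData.find` scan per response item. As a minimal sketch (not one of the benchmarked tests; the composite string key is an assumption for illustration), the cache can be indexed in a `Map` first so each lookup is O(1):

```javascript
// Same generated data as the benchmark's test cases.
const responseData = Array.from({ length: 100 }, (_, i) => ({
  contract: `contract_${i}`, token_id: i, hash: `hash_${i}`,
}));
const cachedData = Array.from({ length: 100 }, (_, i) => ({
  contract: `contract_${i % 2}`, token_id: i, hash: `hash_${i}`,
  nft: i % 2 === 0 ? { data: `NFT data for ${i}` } : null,
}));

// Hypothetical composite key over the three match fields.
const key = item => `${item.contract}|${item.token_id}|${item.hash}`;

// Build the index once: O(n), then each lookup is O(1).
const cacheIndex = new Map(cachedData.map(cache => [key(cache), cache]));

const readyForRender = [];
const toUpdate = [];
for (const item of responseData) {
  const cached = cacheIndex.get(key(item));
  if (cached && cached.nft) {
    readyForRender.push(cached);
  } else {
    toUpdate.push(item);
  }
}
```

With the generated data, only index 0 has a matching cached entry with a loaded `nft`, so `readyForRender` holds 1 item and `toUpdate` holds the remaining 99, matching what the benchmarked variants produce.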
Related benchmarks:
Lodash get() comparison
Map vs Object: Lookup, insert and delete
deepCopy perf test
Lodash cloneDeep vs structuredClone vs JSON Parse vs JSON indirect parse vs in-house deepcopy
Test 6 copy methods - big data