MeasureThat.net
https://stackoverflow.com/posts/61510307
(version: 0)
Benchmark for transforming merged objects
Comparing performance of:
generator vs forloop object assign vs mapReduceSet vs forloopEntries
Created: 6 years ago by Guest
Script Preparation code:
var nrows = 500000,
    nyears = [...Array(20).keys()].map(x => 2000 + x),
    nages = [...Array(20).keys()].map(x => `${18 + x}`),
    npeople = [...Array(nrows / nyears.length / nages.length | 0).keys()].map(x => `Person${x}`);

// Pick an element of x, rotating through the array as i grows.
var getOneRotate = (x, i) => x[(i / x.length | 0) % x.length];

var data = Array(nrows).fill(0).map((x, i) => ({
  year: getOneRotate(nyears, i),
  name: getOneRotate(npeople, i),
  age: getOneRotate(nages, i),
  value: (i / nyears.length | 0) % nyears.length
}));

const NA = 'NA';

function forloopObjectAssign(x) {
  let lname, lage, res = [], years = {};
  for (var i = 0; i < x.length; i++) {
    const { year, value, name, age } = x[i];
    // Start a new group whenever the (name, age) pair changes.
    if (lname != name || lage != age) res.push({ name, age });
    res[res.length - 1][year] = value;
    years[year] = NA;
    lname = name;
    lage = age;
  }
  // Pre-fill every observed year with NA, then overlay each group's values.
  return res.map(x => Object.assign({ ...years }, x));
}

function forloopEntries(x) {
  let lname, lage, res = [], years = {};
  for (var i = 0; i < x.length; i++) {
    const { year, value, name, age } = x[i];
    if (lname != name || lage != age) res.push([['name', name], ['age', age]]);
    res[res.length - 1].push([year, value]);
    years[year] = NA;
    lname = name;
    lage = age;
  }
  return res.map(x => Object.fromEntries(Object.entries(years).concat(x)));
}

function mapReduceSet(data) {
  const years = new Set();
  const result = Object
    .values(data.reduce((r, { name, year, value, ...o }) => {
      // Groups by name only; note the other variants also key on age.
      r[name] = r[name] || { name, ...o };
      r[name][year] = value;
      years.add(year);
      return r;
    }, {}))
    .map((y => o => ({ ...y, ...o }))(Object.fromEntries([...years].map(y => [y, NaN]))));
  return result;
}

let yearsGenerator = {};

function* groupByName(entries) {
  let group = null;
  for (const { name, year, value, age } of entries) {
    if (group === null || group.name !== name || group.age !== age) {
      if (group !== null) {
        yield group;
      }
      group = { name, age };
    }
    group[year] = value;
    yearsGenerator[year] = 'NA';
  }
  if (group !== null) {
    yield group;
  }
}

function generator(entries) {
  return [...groupByName(entries)].map(x => Object.assign({ ...yearsGenerator }, x));
}
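To make the transformation concrete, here is a minimal sketch: the `merge` helper below mirrors the `forloopObjectAssign` logic from the preparation code, and the three sample rows are hypothetical, not the benchmark's 500,000 generated rows. Each contiguous run of rows sharing `name`/`age` collapses into one "wide" object with a property per year; years a group never saw are filled with the `'NA'` placeholder.

```javascript
const NA = 'NA';

// Hypothetical sample data (tiny stand-in for the generated dataset).
const rows = [
  { name: 'Alice', age: '18', year: 2000, value: 1 },
  { name: 'Alice', age: '18', year: 2001, value: 2 },
  { name: 'Bob',   age: '19', year: 2000, value: 3 },
];

// Same shape as forloopObjectAssign: one pass to group contiguous rows,
// then a map to pre-fill every observed year with NA.
function merge(x) {
  let lname, lage, res = [], years = {};
  for (const { year, value, name, age } of x) {
    if (lname !== name || lage !== age) res.push({ name, age });
    res[res.length - 1][year] = value;
    years[year] = NA;
    lname = name;
    lage = age;
  }
  return res.map(r => Object.assign({ ...years }, r));
}

const out = merge(rows);
// out[0] is Alice with 2000: 1 and 2001: 2;
// out[1] is Bob with 2000: 3 and 2001: 'NA' (no value for that year).
console.log(out);
```

Note that this approach only works when rows for the same person are contiguous (i.e. the input is sorted by name and age), which the generated benchmark data guarantees.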
Tests:
generator
generator(data)
forloop object assign
forloopObjectAssign(data)
mapReduceSet
mapReduceSet(data)
forloopEntries
forloopEntries(data)
Latest run results: none. This benchmark does not have any recorded runs yet.
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
**Benchmark Overview** The benchmark measures the performance of four approaches to transforming merged objects: `forloopObjectAssign`, `forloopEntries`, `generator`, and `mapReduceSet`. The test data is a large array of row objects (`year`, `name`, `age`, `value`); each function collapses the rows into one "wide" object per person, with a property per year and missing years filled with a placeholder.

**Options Being Compared**

1. **forloopObjectAssign**: a traditional `for` loop that starts a new group whenever `name` or `age` changes, then fills missing years with `Object.assign`.
2. **forloopEntries**: the same loop, but each group is accumulated as an array of `[key, value]` pairs and finalized with `Object.fromEntries`.
3. **generator**: a generator function (`groupByName`) that yields one group object per contiguous run of rows, consumed via spread.
4. **mapReduceSet**: `Array.prototype.reduce` to group rows into an accumulator object keyed by `name`, followed by `Array.prototype.map` to fill in missing years.

**Pros and Cons of Each Approach**

1. **forloopObjectAssign**:
   * Pros: simple, straightforward, dependency-free.
   * Cons: assumes rows for the same person are contiguous, i.e. sorted input.
2. **forloopEntries**:
   * Pros: equally simple; `Object.fromEntries` keeps object construction declarative.
   * Cons: the intermediate entry arrays add allocation overhead; same sorted-input assumption.
3. **generator**:
   * Pros: lazy, composable iteration; groups can be consumed one at a time.
   * Cons: the iterator protocol adds per-element overhead; same sorted-input assumption; relies on a module-level `yearsGenerator` object as shared state.
4. **mapReduceSet**:
   * Pros: does not require sorted input.
   * Cons: groups by `name` only, ignoring `age`, so it can merge rows the other variants keep separate; fills missing years with `NaN` rather than the `'NA'` string the other variants use; the spread-heavy final `map` creates many intermediate objects.

**Other Considerations**

* **Library usage**: no external libraries are required; all four variants use only built-in methods.
* **Special JS features**: the benchmark exercises modern JavaScript features such as generators, destructuring, rest/spread syntax, `Object.fromEntries`, and arrow functions.

**Alternative Approaches** Other possibilities include a utility library such as Lodash or Underscore.js (e.g. a `groupBy` helper), or a custom transformer built on a `Map` keyed by a composite name/age value, which removes the sorted-input assumption. The right choice depends on the project's performance requirements, code maintainability, and readability.
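As one concrete alternative not included in the benchmark, grouping through a `Map` keyed by a composite name/age value removes the sorted-input assumption the loop and generator variants rely on. The sketch below is illustrative; the function name and sample data are hypothetical.

```javascript
const NA = 'NA';

// Hypothetical Map-based variant (not one of the benchmarked functions).
// Grouping by a composite `name|age` key means rows for the same person
// need not appear contiguously in the input.
function mapGroupBy(rows) {
  const groups = new Map();
  const years = new Set();
  for (const { name, age, year, value } of rows) {
    const key = `${name}|${age}`;
    if (!groups.has(key)) groups.set(key, { name, age });
    groups.get(key)[year] = value; // record this person's value for the year
    years.add(year);               // track every year seen across the dataset
  }
  // Pre-fill every observed year with the placeholder, then overlay the
  // values actually present in each group.
  const blank = Object.fromEntries([...years].map(y => [y, NA]));
  return [...groups.values()].map(g => ({ ...blank, ...g }));
}

// Works even when rows for one person are interleaved with others:
const out = mapGroupBy([
  { name: 'Alice', age: '18', year: 2000, value: 1 },
  { name: 'Bob',   age: '19', year: 2000, value: 3 },
  { name: 'Alice', age: '18', year: 2001, value: 5 },
]);
console.log(out);
```

The trade-off is one `Map` lookup and one string concatenation per row, which the contiguity-based loops avoid; whether that costs more than it saves would itself need benchmarking.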
Related benchmarks:
Merge array of objects with an object
Take two arrays and merge them using an object key (Map vs. object)
Array push vs spread concat
Merge resources: Object.map
Comments