MeasureThat.net
set.add vs array.push for massive objects
(version: 0)
Comparing performance of:
array.push vs set.add
Created: 2 years ago by Guest
Tests:
array.push
const massiveString = new Array(10000).fill("s").join("");
const myMassiveObject = new Array(10000).fill(1).reduce((p, v, i) => { p[i] = massiveString; return p; }, {});
const a = [];
for (let i = 0; i < 3000; i++) a.push({ myMassiveObject, d: Date.now() });
set.add
const massiveString = new Array(10000).fill("s").join("");
const myMassiveObject = new Array(10000).fill(1).reduce((p, v, i) => { p[i] = massiveString; return p; }, {});
const s = new Set();
for (let i = 0; i < 3000; i++) s.add({ myMassiveObject, d: Date.now() });
Latest run results:
Run details: (test run date: 2 years ago)
User agent:
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36
Browser/OS:
Chrome 122 on Linux
Test name     Executions per second
array.push    368.8 Ops/sec
set.add       360.6 Ops/sec
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
**What is being tested?**
The benchmark measures the performance difference between `Array.prototype.push` and `Set.prototype.add` when appending large objects. The preparation code builds a 10,000-character string and a large object that references it; the timed loop then creates 3,000 small wrapper objects and appends each one to an array (via `push`) or to a set (via `add`).

**Options compared:**
1. **array.push**: appends an element to the end of an array.
2. **set.add**: inserts an element into a `Set`, which stores only unique values.

**Pros and cons of each approach:**
* **array.push**:
  + Pros: simple, allows duplicates, and universally supported.
  + Cons: the engine occasionally reallocates the array's backing storage as it grows (amortized, not on every push), and an array offers no built-in deduplication.
* **set.add**:
  + Pros: automatic deduplication and fast membership checks, backed by an internal hash table.
  + Cons: slightly more per-insertion overhead than a plain array append, and for objects deduplication is by reference, so distinct object literals are never merged. In this run the two came out nearly identical, with `push` marginally ahead.

**Library used:** none; both `Array.prototype.push()` and `Set.prototype.add()` are built into JavaScript.

**Special JS feature/syntax:** none.
**Benchmark preparation code explanation:**
* `const massiveString = new Array(10000).fill("s").join("")` creates a 10,000-character string of "s" characters.
* The `reduce` call builds a large object whose properties reference that string.
* The timed loop then runs 3,000 iterations, each pushing or adding a small wrapper object that references `myMassiveObject`. Because every wrapper shares a single reference to the large object, neither loop copies the underlying data.

**Other alternatives:**
If you need to compare other approaches for accumulating many objects, consider:
1. **A linked list**: appending to a linked list avoids array reallocation, at the cost of worse cache locality and indexing.
2. **A custom container class**: a class wrapping a hash table or a balanced binary search tree can provide whatever insertion/lookup trade-off the workload needs.

When optimizing performance-critical code, weigh memory allocation, garbage-collection pressure, and data-structure semantics (ordering, uniqueness) against raw insertion speed.
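The deduplication point above is worth making concrete: a `Set` compares object values by reference, so the benchmark's loop, which creates a fresh object literal on every iteration, never triggers deduplication. A small sketch (variable names are illustrative, not from the benchmark):

```javascript
// A Set deduplicates objects by reference, not by structural equality.
const shared = { big: "s".repeat(10000) };

// Fresh object literal each iteration: every one is a distinct reference,
// so the Set keeps all 3000 entries, just like an array would.
const s = new Set();
for (let i = 0; i < 3000; i++) s.add({ shared, d: Date.now() });
console.log(s.size); // 3000

// Adding the *same* reference repeatedly stores it only once.
const s2 = new Set();
for (let i = 0; i < 3000; i++) s2.add(shared);
console.log(s2.size); // 1
```

So for this workload the `Set`'s uniqueness machinery is pure overhead, which is consistent with `array.push` edging it out in the measured results.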
Related benchmarks:
JavaScript spread operator vs Object.assign performance without mutating original object
JavaScript spread operator vs Object.assign performance with {} target obj
JavaScript spread operator vs Object.assign performance (create new objects)
JavaScript spread operator vs Object.assign performance without overwriting original object
JavaScript spread operator vs Object.assign performance with new object in assign