MeasureThat.net
Run results for:
uint8array vs dataview extract
Run details:
User agent:
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36
Browser:
Chrome 131
Operating system:
Mac OS X 10.15.7
Device Platform:
Desktop
Date tested:
one year ago
Test name
Executions per second
dataview
136250.6 Ops/sec
uint8array
148085.9 Ops/sec
Script Preparation code:
// Shared data for both tests: [32-bit int, 16-bit int, 32-bit int]
// Values: 1234567890, 12345, and 987654321
const buffer = new ArrayBuffer(10); // 4 bytes int32 + 2 bytes int16 + 4 bytes int32
const dataView = new DataView(buffer);

// Populate the buffer in little-endian byte order
dataView.setInt32(0, 1234567890, true); // 32-bit int at offset 0
dataView.setInt16(4, 12345, true);      // 16-bit int at offset 4
dataView.setInt32(6, 987654321, true);  // 32-bit int at offset 6

// The same data as raw bytes: 1234567890 (0x499602D2), 12345 (0x3039),
// and 987654321 (0x3ADE68B1), each in little-endian order
const bytes = new Uint8Array([
    0xD2, 0x02, 0x96, 0x49, // 1234567890
    0x39, 0x30,             // 12345
    0xB1, 0x68, 0xDE, 0x3A  // 987654321
]);
Tests:
dataview
// Reading the integers using DataView
const int32_1 = dataView.getInt32(0, true); // first 32-bit integer, little-endian
const int16 = dataView.getInt16(4, true);   // 16-bit integer, little-endian
const int32_2 = dataView.getInt32(6, true); // second 32-bit integer, little-endian
console.log(int32_1); // 1234567890
console.log(int16);   // 12345
console.log(int32_2); // 987654321
uint8array
// Helper: read a 32-bit integer at a byte offset, assuming little-endian.
// Note: >>> 0 coerces the result to an unsigned 32-bit value; for the positive
// test values this matches DataView.getInt32, but negative int32 values would differ.
function readInt32LE(bytes, offset) {
    return (bytes[offset] |
            (bytes[offset + 1] << 8) |
            (bytes[offset + 2] << 16) |
            (bytes[offset + 3] << 24)) >>> 0;
}

// Helper: read a 16-bit integer at a byte offset, assuming little-endian
function readInt16LE(bytes, offset) {
    return bytes[offset] | (bytes[offset + 1] << 8);
}

// The benchmark inlines the same logic rather than calling the helpers above,
// so function-call overhead is not part of what is measured
const int32_1 = (bytes[0] | (bytes[1] << 8) | (bytes[2] << 16) | (bytes[3] << 24)) >>> 0; // first 32-bit integer
const int16 = bytes[4] | (bytes[5] << 8); // 16-bit integer
const int32_2 = (bytes[6] | (bytes[7] << 8) | (bytes[8] << 16) | (bytes[9] << 24)) >>> 0; // second 32-bit integer
console.log(int32_1); // 1234567890
console.log(int16);   // 12345
console.log(int32_2); // 987654321