MeasureThat.net
Markdown Performance Comparison 101
(version: 0)
A performance comparison of leading JavaScript Markdown implementations.
Comparing performance of:
Marked vs CommonMark vs Markdown-it vs Remarkable vs Micromarkdown vs Snarkdown vs markdown-wasm
Created: 4 years ago by Guest
HTML Preparation code:
<script type="text/javascript" src="https://unpkg.com/marked@1.1.0/marked.min.js"></script>
<script type="text/javascript" src="https://unpkg.com/commonmark@0.29.1/dist/commonmark.min.js"></script>
<script type="text/javascript" src="https://unpkg.com/markdown-it@11.0.0/dist/markdown-it.min.js"></script>
<script type="text/javascript" src="https://unpkg.com/remarkable@2.0.1/dist/remarkable.min.js"></script>
<script type="text/javascript" src="https://unpkg.com/micromarkdown@0.3.0/dist/micromarkdown.min.js"></script>
<script type="text/javascript" src="https://unpkg.com/snarkdown@1.2.2/dist/snarkdown.umd.js"></script>
<script type="text/javascript" src="https://lukeuser.cacus.feralhosting.com/markdown.js"></script>
Script Preparation code:
// One entry per library under test; each exposes a different call signature.
var conv = {
    marked: marked,
    cm: new commonmark.HtmlRenderer(),
    mdit: markdownit({ html: true }),
    remarkable: new remarkable.Remarkable(),
    micromarkdown: window.micromarkdown,
    snarkdown: window.snarkdown,
    markdown: window.markdown
};
var pars = { cm: new commonmark.Parser() };
// Sample Markdown input shared by every test case.
function md() {
    return "## Can't we simplify? \n\nBecause running websites and managing databases isn't always worth the effort, \"Static Content Generators\" like [Dr. jekyll](http://jekyllrb.com/) and [Mr. Hyde](http://hyde.github.io/) have begun to gain traction as people pre-build websites and then place the static HTML results onto their server. \n\nThis means the web server doesn't need to be configured with any special software or databases - it simply serves the generated static HTML files.\n\nHowever, you still have to install the generator and setup a \"build\" process of sorts after every article.\n\n## Meet Jr.\n\n`Jr` is a truly *static*, static content generator. All the processing of your files happens on the requesting client's computer as needed. The whole system is written in client-side JavaScript. This means:\n\n- minimal bandwidth requirements\n- better search engine indexing\n- awesome screen reader support\n- *zero* security vulnerabilities\n- and more!\n\nHowever, the neatest thing about `Jr` is that you don't have to configure, setup, or install _anything_! Simply download the files, create your articles, and upload everything to your server!\n\ndone.";
}
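Because every library is invoked through a different entry point, a small wrapper table can normalize them behind one uniform call. Here is a minimal, self-contained sketch of that idea; the `renderers` table and the `stub` converters are illustrative stand-ins, not part of the benchmark itself (the real calls are shown in the comments):

```javascript
// Stub converters standing in for the real libraries so the sketch
// runs on its own; each wrapper takes Markdown and returns an HTML string.
const stub = (name) => (src) => `<!-- ${name} --><p>${src}</p>`;

const renderers = {
  marked:        stub("marked"),        // real: conv.marked(src)
  commonmark:    stub("commonmark"),    // real: conv.cm.render(pars.cm.parse(src))
  markdownIt:    stub("markdown-it"),   // real: conv.mdit.render(src)
  remarkable:    stub("remarkable"),    // real: conv.remarkable.render(src)
  micromarkdown: stub("micromarkdown"), // real: conv.micromarkdown.parse(src, true)
  snarkdown:     stub("snarkdown"),     // real: conv.snarkdown(src)
  markdownWasm:  stub("markdown-wasm"), // real: conv.markdown.parse(src)
};

// Uniform invocation: every test case reduces to renderers[name](md()).
for (const [name, render] of Object.entries(renderers)) {
  console.log(name, render("*hello*").length > 0 ? "ok" : "empty");
}
```

With wrappers like these, adding an eighth library to the suite is one more table entry rather than a new, differently-shaped test case.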
Tests:
Marked
conv.marked(md());
CommonMark
conv.cm.render(pars.cm.parse(md()));
Markdown-it
conv.mdit.render(md());
Remarkable
conv.remarkable.render(md());
Micromarkdown
conv.micromarkdown.parse(md(),true);
Snarkdown
conv.snarkdown(md());
markdown-wasm
try { conv.markdown.parse(md()); } catch { return null; }
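Each test case above is scored in executions per second. The core idea can be sketched with a plain timing loop; this is a simplified stand-in for the site's actual engine (real harnesses such as Benchmark.js also warm up the JIT and report statistical error margins), and the `opsPerSecond` helper and sample function are illustrative:

```javascript
// Rough ops/sec measurement: call fn repeatedly for ~budgetMs
// milliseconds and divide completed calls by elapsed wall time.
function opsPerSecond(fn, budgetMs = 200) {
  const start = Date.now();
  let calls = 0;
  while (Date.now() - start < budgetMs) {
    fn();
    calls++;
  }
  const elapsedSec = (Date.now() - start) / 1000;
  return calls / elapsedSec;
}

// Example: time a trivial "parser" stand-in for one of the test cases.
const sample = () => "## heading".replace(/^## (.*)$/, "<h2>$1</h2>");
console.log(`~${Math.round(opsPerSecond(sample))} ops/sec`);
```

A fixed time budget rather than a fixed call count keeps run time predictable even when the measured function is orders of magnitude slower than expected.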
Autogenerated LLM Summary (model: llama3.2:3b, generated one year ago):
**Benchmark Overview** The benchmark compares seven Markdown parsing libraries: Marked, CommonMark, Markdown-it, Remarkable, Micromarkdown, Snarkdown, and markdown-wasm. It measures which library renders the sample Markdown text fastest.

**Preparation Code** The preparation code builds an object `conv` holding an instance or entry function for each library, a `pars` object holding a CommonMark parser, and a function `md()` that returns the sample Markdown text (headings, emphasis, links, and a bulleted list).

**Individual Test Cases** Each test case invokes its library through that library's own API:

1. **Marked**: calls `conv.marked(md())` directly.
2. **CommonMark**: parses with `pars.cm.parse(md())`, then renders the resulting tree with `conv.cm.render(...)`.
3. **Markdown-it**: calls `conv.mdit.render(md())`, which parses and renders in one step.
4. **Remarkable**: calls `conv.remarkable.render(md())`.
5. **Micromarkdown**: calls `conv.micromarkdown.parse(md(), true)`.
6. **Snarkdown**: calls `conv.snarkdown(md())`.
7. **markdown-wasm**: calls `conv.markdown.parse(md())` inside a try/catch, exercising a WebAssembly build of the parser.

**Benchmark Results** The run summarized here reported executions per second for each library: markdown-wasm was fastest at roughly 152,193 executions per second, followed by Micromarkdown at roughly 43,179. The remaining libraries were slower, with Remarkable and Snarkdown performing similarly (around 30,000-40,000 executions per second).

**Key Takeaways**

* markdown-wasm was the fastest Markdown parser in this run; its WebAssembly implementation accounts for the lead.
* Micromarkdown was the next most performant option.
* Remarkable and Snarkdown had comparable throughput.

Overall, this benchmark provides a useful view of the relative performance of popular JavaScript Markdown parsers.
Related benchmarks:
Markdown Performance Comparison 100
Markdown Performance Comparison 300
Markdown Performance Comparison 1000
Markdown Performance Comparison 1001
Markdown Performance Comparison Redux