Project: remark-embedder/cache

Package: @remark-embedder/cache@2.1.0

    @remark-embedder/cache

    A cache for @remark-embedder/core


    The problem

    You're using @remark-embedder/core and you want to cache the results of your transformers long-term so you don't have to make network requests for HTML every time.

    This solution

    This is a cache implementation specifically for @remark-embedder/core that saves the results of getHTML for a transformer to disk (in node_modules/.cache by default).

    Installation

    This module is distributed via npm which is bundled with node and should be installed as one of your project's dependencies:

    npm install @remark-embedder/cache
    

    Usage

    import {remark} from 'remark'
    import html from 'remark-html'
    import remarkEmbedder from '@remark-embedder/core'
    import Cache from '@remark-embedder/cache'
    
    const cache = new Cache()
    
    // any markdown containing URLs your transformers can embed
    const someMarkdown = '...'
    
    async function go() {
      const result = await remark()
        .use(remarkEmbedder, {
          cache,
          transformers: [
            // transformers
          ],
        })
        .use(html)
        .process(someMarkdown)
    }
    
    go().then(go).then(go).then(go)
    
    // your transformers will only be called once even though we call process 4 times.
    

    The default directory is pretty reasonable: path.join(process.cwd(), 'node_modules/.cache/@remark-embedder/cache'), but if you want to change it, that's the first argument of the Cache constructor: new Cache(directory).

    Inspiration

    Other Solutions

    I'm not aware of any. If you know of one, please make a pull request and add it here!

    Issues

    Looking to contribute? Look for the Good First Issue label.

    πŸ› Bugs

    Please file an issue for bugs, missing documentation, or unexpected behavior.

    See Bugs

    πŸ’‘ Feature Requests

    Please file an issue to suggest new features. Vote on feature requests by adding a πŸ‘. This helps maintainers prioritize what to work on.

    See Feature Requests

    Contributors ✨

    Thanks goes to these people (emoji key):


    Kent C. Dodds

    πŸ’» πŸ“– πŸš‡ ⚠️

    MichaΓ«l De Boey

    πŸ“– πŸ’» 🚧

    Andreas Houben

    πŸ“–

    This project follows the all-contributors specification. Contributions of any kind welcome!

    LICENSE

    MIT