This repository was archived by the owner on Dec 5, 2019. It is now read-only.

refactor: replace cacache with flat-cache #121

Closed
wants to merge 1 commit into from

Conversation

@filipesilva (Contributor) commented Sep 6, 2017

@joshwiens (Member) commented Oct 20, 2017

@evilebottnawi @michael-ciniawsky - As soon as we land #108, this one is up next to be included in the initial 1.0.0 release. The @angular/cli group needs this for a perf issue.

@filipesilva (Contributor Author)

@d3viant0ne I'll need to update this to work on top of #108.

@michael-ciniawsky (Member)

Can someone (or I) land #108, so we can then triage and fix this? Did someone try asking for a LICENSE change at the cacache repo?

@joshwiens (Member) commented Oct 20, 2017

@michael-ciniawsky - The snapshots in #108 need to be updated before that merges. Outside of that, I'm pretty sure it's ready to go.

RE License change: That gets messy and has lingering legal issues, not to mention we would probably have to engage the JSF legal team, which I would rather avoid.

We shouldn't be using CC0 libs either for pretty much the same reason as Angular.

@michael-ciniawsky (Member)

#108 landed 🎉

@michael-ciniawsky (Member)

Are there any alternatives available? flat-cache uses sync I/O.

@michael-ciniawsky added this to the 1.0.0 milestone Oct 21, 2017
@alexander-akait (Member)

@filipesilva please rebase

@filipesilva changed the title [WIP] refactor: replace cacache with flat-cache → refactor: replace cacache with flat-cache Oct 22, 2017
@filipesilva (Contributor Author)

@michael-ciniawsky there are a couple of alternatives; I just searched for "cache" on npm and went through the list. My criteria were that it looked well supported, reliable, and had a similar API/feature set. flat-cache seemed to fit the bill. Do you have any others in mind?

@evilebottnawi done.

@filipesilva (Contributor Author)

I'm looking at the snapshots and there seems to be a problem, still fixing.

@filipesilva (Contributor Author)

Seems good now 👍

@michael-ciniawsky (Member) left a review comment:

Can't we wait until the relicensing is done?
Do we have any basic benchmark data comparing flat-cache to cacache? I'm worried about flat-cache using sync file I/O.


const cacheId = 'uglifyjs-webpack-plugin';

export default class Cache {
Review comment (Member):

Why do we need a class here?

const cache = {
  get (dir, key) {},
  put (dir, key, data) {},
  list (dir) {},
  clear (dir) {}
}

export default cache

Reply (Contributor Author):

We don't necessarily need a class, no. Something with a similar API was needed for the Jest spies, but that's it. I'll change it to an object if that's preferable.

Reply (Contributor Author):

Bear in mind that changing it from Cache to cache necessitated a few other renames to prevent shadowed variables.

Review comment (Member):

Loading the whole cache (flatCache.load(id, dir)) on every method call seems odd, or am I missing something?
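For illustration, here is a minimal sketch of the two patterns being compared. `loadCache` is a hypothetical in-memory stand-in for `flatCache.load(id, dir)` (the real call re-reads and re-parses the cache file from disk each time); it counts loads instead of touching the filesystem:

```javascript
// Hypothetical stand-in for flatCache.load(id, dir).
let loads = 0;
function loadCache(id, dir) {
  loads += 1; // the real flatCache.load would hit the disk here
  return { getKey: () => undefined, setKey: () => {}, save: () => {} };
}

// Pattern under review: every method call re-loads the whole cache.
const perCall = {
  get(dir, key) { return loadCache('id', dir).getKey(key); },
  put(dir, key, data) {
    const c = loadCache('id', dir);
    c.setKey(key, data);
    c.save();
  },
};

// Alternative: load once and reuse the handle for all later calls.
function makeCache(dir) {
  const c = loadCache('id', dir);
  return {
    get(key) { return c.getKey(key); },
    put(key, data) { c.setKey(key, data); c.save(); },
  };
}

perCall.get('/tmp', 'a');        // load #1
perCall.put('/tmp', 'a', 1);     // load #2
const once = makeCache('/tmp');  // load #3
once.get('a');                   // no further loads
once.put('a', 1);
console.log(loads); // 3 — the per-call pattern paid one load per call
```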

import findCacheDir from 'find-cache-dir';
import workerFarm from 'worker-farm';
import minify from './minify';
import Cache from './cache';
Review comment (Member):

Cache => cache

Reply (Contributor Author):

Done

@@ -0,0 +1,23 @@
import flatCache from 'flat-cache';

const cacheId = 'uglifyjs-webpack-plugin';
Review comment (Member):

cacheId => id

Reply (Contributor Author):

Done


const id = 'uglifyjs-webpack-plugin';

const cache = {
Review comment (Member):

export default class Cache {
  constructor (dir) {
    this.id = 'uglifyjs-webpack-plugin'
    this.cache = flatCache.load(this.id, dir)
  }

  get (key) {
    return this.cache.getKey(key)
  }

  put (key, data) {
    this.cache.setKey(key, JSON.stringify(data))
    this.cache.save(true)
  }

  list () {
    return this.cache.all()
  }

  clear (dir) {
    flatCache.clearAll(dir)
  }
}

@michael-ciniawsky (Member) commented Oct 22, 2017:

Alternative

// cache(dir)
export default function (dir) {
  const id = 'uglifyjs-webpack-plugin'
  const cache = flatCache.load(id, dir)

  return {
    get (key) {
      return cache.getKey(key)
    },
    put (key, data) {
      cache.setKey(key, JSON.stringify(data))
      cache.save(true)
    },
    list () {
      return cache.all()
    },
    clear (dir) {
      flatCache.clearAll(dir)
    }
  }
}

import findCacheDir from 'find-cache-dir';
import workerFarm from 'worker-farm';
import minify from './minify';
import cache from './cache';
Review comment (Member):

cache => Cache

-    const { cache, parallel } = options;
-    this.cacheDir = cache === true ? findCacheDir({ name: 'uglifyjs-webpack-plugin' }) : cache;
+    const { cache: useCache, parallel } = options;
+    this.cacheDir = useCache === true ? findCacheDir({ name: 'uglifyjs-webpack-plugin' }) : useCache;
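For context, the rename is needed because the module now imports `cache`, so destructuring `{ cache }` from the options would shadow that import. A small self-contained sketch (the names here are stand-ins, not the plugin's actual code):

```javascript
// Stand-in for `import cache from './cache'` at module scope.
const cache = { put() {} };

// Plugin options: `cache` here is a boolean/path option, not the module.
const options = { cache: true, parallel: false };

// Destructuring with a rename keeps the option and the import distinct:
const { cache: useCache, parallel } = options;

console.log(useCache, parallel); // true false
console.log(typeof cache.put);   // "function" — the import is untouched
```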
Review comment (Member):

this.cacheDir = cache === true ? findCacheDir({ name: 'uglifyjs-webpack-plugin' }) : cache;
this.cache = new Cache(this.cacheDir);

@@ -58,15 +58,21 @@ export default class {
       const done = () => step(index, result);

       if (this.cacheDir && !result.error) {
-        cacache.put(this.cacheDir, task.cacheKey, JSON.stringify(data)).then(done, done);
+        cache.put(this.cacheDir, task.cacheKey, JSON.stringify(data));
Review comment (Member):

this.cache.put(task.cacheKey, JSON.stringify(data))

@filipesilva (Contributor Author)

@michael-ciniawsky the approach you're suggesting (using an instanced class) will not work with the tests as they exist currently, since they rely on being able to spy on static cache methods. It would also require the cache instance being exposed so the tests could tap into it. Is this something you want changed?
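To illustrate the testing concern, here is a rough sketch using a hand-rolled spy in place of Jest (names are hypothetical): patching is straightforward when consumers call methods on a shared exported object, but an instance created inside the consumer is unreachable from the test unless it is exposed.

```javascript
// Shared, static-style cache object (what the current tests patch).
const cache = {
  put(key, data) { /* would write to disk in the real module */ },
};

// Consumer calls the shared object directly.
function runTask(key, data) { cache.put(key, data); }

// Test side: patch the exported method, observe the call, then restore.
// jest.spyOn(cache, 'put') does essentially this.
const calls = [];
const original = cache.put;
cache.put = (...args) => { calls.push(args); };
runTask('k', 'v');
cache.put = original;
console.log(calls.length); // 1 — the spy saw the call

// With an instance created inside the consumer, the test holds no
// reference to `c`, so it cannot swap out c.put unless the consumer
// exposes the instance.
class Cache {
  put(key, data) {}
}
function runTaskWithInstance(key, data) {
  const c = new Cache('/tmp');
  c.put(key, data);
}
runTaskWithInstance('k', 'v'); // nothing observable from the outside
```

(Jest can also patch `Cache.prototype.put`, but that would mean rewriting the existing tests.)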

Blocked by #108.

See #120 for reasoning.

Fix #120.
@hansl commented Oct 23, 2017

@michael-ciniawsky the process of relicensing cacache would likely take weeks, and we've (Angular CLI's team) been waiting since July for this release to have a proper license. The current adaptor would allow us to change back to cacache when they do support an open source license, without any changes to the code.

@joshwiens (Member)

FYI - In its current state this throws when trying to access cached build assets.

@filipesilva (Contributor Author)

@d3viant0ne did you pull the new changes? I tried a rebuild some 10m ago and rebuilds were good. I amended the previous commit with the fix some 20m ago.

@zkat (Contributor) commented Oct 23, 2017

Heads-up: I'm working on relicensing cacache, if it makes any difference here. Should be ready soonish, since folks are 👍 pretty quick and there aren't a lot of contributors rn.

Edit: I took a look at flat-cache and I suggest y'all find something different. I don't think that library's gonna hold up at all under concurrency conditions. The most likely scenario is you'll have an eternally-invalidated cache, since you're going to have a race that looks like:

worker 1                               worker 2
 |                                      | - truncate cache file
 |                                      | - write chunk 1
 | - truncate cache file                |
 |                                      | - write chunk 2
 | - write chunk 1                      |
 |                                      | - close cache file
 |
 | - write chunk 2
 | - close cache file

meaning:

"foo": 1
"bar": 2
} // end of worker 2, chunk 2
"a": 1, // worker 1, chunk 1
"b": 2,
"c": 3
} // end of worker 1, chunk 1

And as soon as flat-cache reloads that file, it's just going to throw it all away

@zkat (Contributor) commented Oct 23, 2017

@hansl fwiw, no one told me about this until a couple of days ago, and I needed an excuse to start the relicense process, since it needed to happen anyway.

@michael-ciniawsky (Member)

Fixed by #145

@filipesilva (Contributor Author)

Heya @zkat, I just wanted to thank you for being so speedy in relicensing cacache! That was really fast!

Your analysis of flat-cache under concurrency conditions also makes a lot of sense. That hadn't occurred to me at all :/

@michael-ciniawsky (Member)

Yep, thanks again @zkat for all the (quick) help here 👍
