Add CachingLocator. #14020
Merged: ericsnowcurrently merged 39 commits into microsoft:main from ericsnowcurrently:pyenvs-component-caching-locator on Oct 8, 2020.
Changes from 37 commits (39 commits total).

Commits (all by ericsnowcurrently):

2b02bfc  Add a basic implementation of CachingLocator.
8fa4647  Add a noop envs cache implementation.
f7efae8  Use CachingLocator.
33b63ba  Adjust PythonEnvInfoCache for use as a testing fake.
8220cea  Replace usage of EmptyCache with PythonEnvInfoCache.
7350eb2  Make CachingLocator.initialize() idempotent.
c328bd6  Move the onChanged hook to initialize().
0a442e3  Pass the change event through to refresh().
8642594  Do not inherit from PythonEnvsWatcher.
e778d73  Add CachingLocator.dispose().
0c6925a  Factor out initialRefresh().
a973035  Add BackgroundLooper.
b87ec6b  Only run a single refresh operation at a time.
bf39d81  Simplify BackgroundLooper.
5ad921d  Add support for retries.
4a22fec  Eliminate the queue stability issues with getID().
8a561e0  Fix the logic of the run loop.
6038238  Periodically refresh the cache.
72becf8  Rely on onChanged to know if the cache is stale.
764482a  Add BackgroundLooper.getNextRequest().
1a71c49  Factor out CachingLocator.iterFromDownstream().
99af470  Fix the tslint rules.
e31ee15  Fix callbacks in SimpleLocator.
d73de35  Fix a typo.
c26b56c  Use a syntactic shortcut.
e756686  Add doc comments on internal methods.
316dd91  Make the different refresh scenarios a bit easier to follow.
c1ee05b  Add doc comments for BackgroundLooper.
d666874  Clarify a comment about popping the next request off the queue.
b091801  Drop the retry/periodic code.
0c90c6f  Fix typos.
6bc3b00  Drop unnecessary eslint directives.
89b6e5e  Move BackgroundRequestLooper to its own file.
5fd0141  Drop a dead comment.
68a618f  downstream -> wrapped
adacb7c  refresh() -> addRefreshRequest()
e8914f6  Clarify a potentially confusing situation in getGlobalPersistentStore().
134915b  Use a more concise syntax.
fa63318  lint
@@ -0,0 +1,247 @@  (new file)
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

import { createDeferred } from './async';

type RequestID = number;
type RunFunc = () => Promise<void>;
type NotifyFunc = () => void;

/**
 * This helps avoid running duplicate expensive operations.
 *
 * The key aspect is that already running or queued requests can be
 * re-used instead of creating a duplicate request.
 */
export class BackgroundRequestLooper {
    private readonly opts: {
        runDefault: RunFunc;
    };

    private started = false;

    private stopped = false;

    private readonly done = createDeferred<void>();

    private readonly loopRunning = createDeferred<void>();

    private waitUntilReady = createDeferred<void>();

    private running: RequestID | undefined;

    // For now we don't worry about a max queue size.
    private readonly queue: RequestID[] = [];

    private readonly requests: Record<RequestID, [RunFunc, Promise<void>, NotifyFunc]> = {};

    private lastID: number | undefined;

    constructor(
        opts: {
            runDefault?: RunFunc | null;
        } = {}
    ) {
        this.opts = {
            runDefault: opts.runDefault
                ? opts.runDefault
                : async () => {
                      throw Error('no default operation provided');
                  }
        };
    }

    /**
     * Start the request execution loop.
     *
     * Currently it does not support being re-started.
     */
    public start(): void {
        if (this.stopped) {
            throw Error('already stopped');
        }
        if (this.started) {
            return;
        }
        this.started = true;

        this.runLoop().ignoreErrors();
    }

    /**
     * Stop the loop (assuming it was already started).
     *
     * @returns - a promise that resolves once the loop has stopped.
     */
    public stop(): Promise<void> {
        if (this.stopped) {
            return this.loopRunning.promise;
        }
        if (!this.started) {
            throw Error('not started yet');
        }
        this.stopped = true;

        this.done.resolve();

        // It is conceivable that a separate "waitUntilStopped"
        // operation would be useful.  If it turned out to be desirable
        // then at that point we could add such a method separately.
        // It would do nothing more than `await this.loopRunning`.
        // Currently there is no need for a separate method since
        // returning the promise here is sufficient.
        return this.loopRunning.promise;
    }

    /**
     * Return the most recent active request, if any.
     *
     * If there are no pending requests then this is the currently
     * running one (if one is running).
     *
     * @returns - the ID of the request and its completion promise;
     *            if there are no active requests then you get `undefined`
     */
    public getLastRequest(): [RequestID, Promise<void>] | undefined {
        let reqID: RequestID;
        if (this.queue.length > 0) {
            reqID = this.queue[this.queue.length - 1];
        } else if (this.running !== undefined) {
            reqID = this.running;
        } else {
            return undefined;
        }
        // The req cannot be undefined since every queued ID has a request.
        const [, promise] = this.requests[reqID];
        return [reqID, promise];
    }

    /**
     * Return the request that is waiting to run next, if any.
     *
     * The request is the next one that will be run. This implies that
     * there is one already running.
     *
     * @returns - the ID of the request and its completion promise;
     *            if there are no pending requests then you get `undefined`
     */
    public getNextRequest(): [RequestID, Promise<void>] | undefined {
        if (this.queue.length === 0) {
            return undefined;
        }
        const reqID = this.queue[0];
        // The req cannot be undefined since every queued ID has a request.
        const [, promise] = this.requests[reqID]!;
        return [reqID, promise];
    }

    /**
     * Request that a function be run.
     *
     * If one is already running then the new request is added to the
     * end of the queue. Otherwise it is run immediately.
     *
     * @returns - the ID of the new request and its completion promise;
     *            the promise resolves once the request has completed
     */
    public addRequest(run?: RunFunc): [RequestID, Promise<void>] {
        const reqID = this.getNextID();
        // This is the only method that adds requests to the queue
        // and `getNextID()` keeps us from having collisions here.
        // So we are guaranteed that there are no matching requests
        // in the queue.
        const running = createDeferred<void>();
        this.requests[reqID] = [
            // [RunFunc, "done" promise, NotifyFunc]
            run ?? this.opts.runDefault,
            running.promise,
            () => running.resolve()
        ];
        this.queue.push(reqID);
        if (this.queue.length === 1) {
            // `waitUntilReady` will get replaced with a new deferred
            // in the loop once the existing one gets used.
            // We let the queue clear out before triggering the loop
            // again.
            this.waitUntilReady.resolve();
        }
        return [reqID, running.promise];
    }

    /**
     * This is the actual loop where the queue is managed and waiting happens.
     */
    private async runLoop(): Promise<void> {
        const getWinner = () => {
            const promises = [
                // These are the competing operations.
                // Note that the losers keep running in the background.
                this.done.promise.then(() => 0),
                this.waitUntilReady.promise.then(() => 1)
            ];
            return Promise.race(promises);
        };

        let winner = await getWinner();
        while (!this.done.completed) {
            if (winner === 1) {
                this.waitUntilReady = createDeferred<void>();
                await this.flush();
            } else {
                // This should not be reachable.
                throw Error(`unsupported winner ${winner}`);
            }
            winner = await getWinner();
        }
        this.loopRunning.resolve();
    }

    /**
     * Run all pending requests, in queue order.
     *
     * Each request's completion promise resolves once that request
     * finishes.
     */
    private async flush(): Promise<void> {
        if (this.running !== undefined) {
            // We must be flushing the queue already.
            return;
        }
        // Run every request in the queue.
        while (this.queue.length > 0) {
            const reqID = this.queue[0];
            this.running = reqID;
            // We pop the request off the queue here so it doesn't show
            // up as both running and pending.
            this.queue.shift();
            const [run, , notify] = this.requests[reqID];

            await run();

            // We leave the request until right before `notify()`
            // for the sake of any calls to `getLastRequest()`.
            delete this.requests[reqID];
            notify();
        }
        this.running = undefined;
    }

    /**
     * Provide the request ID to use next.
     */
    private getNextID(): RequestID {
        // For now there is no way to queue up a request with
        // an ID that did not originate here. So we don't need
        // to worry about collisions.
        if (this.lastID === undefined) {
            this.lastID = 1;
        } else {
            this.lastID += 1;
        }
        return this.lastID;
    }
}
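
For reviewers skimming the diff, here is a minimal usage sketch of BackgroundRequestLooper; it is not part of the PR. The import path and the refreshCache callback are placeholders, and the sketch assumes the extension's usual utilities are loaded (createDeferred and the Promise ignoreErrors() augmentation the class relies on). Per the commit messages above, the real consumer is CachingLocator, which uses the looper to run only one cache refresh at a time.

// Usage sketch only -- not code from this PR. The import path and the
// refreshCache callback are illustrative placeholders.
import { BackgroundRequestLooper } from './backgroundRequestLooper';

// Placeholder for an expensive operation (e.g. re-scanning Python environments).
async function refreshCache(): Promise<void> {
    // ... do the expensive work ...
}

export async function example(): Promise<void> {
    const looper = new BackgroundRequestLooper({ runDefault: refreshCache });
    looper.start();

    // Queue two requests; they run one at a time, in order.
    const [, done1] = looper.addRequest();
    const [secondID, done2] = looper.addRequest();

    // A caller that just needs "a refresh after now" can piggyback on the
    // already-queued request instead of adding a duplicate.
    const last = looper.getLastRequest();
    if (last !== undefined && last[0] === secondID) {
        await last[1];
    }

    await Promise.all([done1, done2]);
    await looper.stop();
}

The piggybacking at the end is the point of getLastRequest()/getNextRequest(): callers can await an existing request rather than enqueueing duplicate work, which matches the class doc comment above.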