Add CachingLocator. #14020

Merged
39 commits
2b02bfc  Add a basic implementation of CachingLocator. (ericsnowcurrently, Sep 21, 2020)
8fa4647  Add a noop envs cache implementation. (ericsnowcurrently, Sep 23, 2020)
f7efae8  Use CachingLocator. (ericsnowcurrently, Sep 21, 2020)
33b63ba  Adjust PythonEnvInfoCache for use as a testing fake. (ericsnowcurrently, Oct 5, 2020)
8220cea  Replace usage of EmptyCache with PythonEnvInfoCache. (ericsnowcurrently, Oct 5, 2020)
7350eb2  Make CachingLocator.initialize() idempotent. (ericsnowcurrently, Sep 29, 2020)
c328bd6  Move the onChanged hook to initialize(). (ericsnowcurrently, Sep 29, 2020)
0a442e3  Pass the change event through to refresh(). (ericsnowcurrently, Sep 29, 2020)
8642594  Do not inherit from PythonEnvsWatcher. (ericsnowcurrently, Sep 29, 2020)
e778d73  Add CachingLocator.dispose(). (ericsnowcurrently, Sep 29, 2020)
0c6925a  Factor out initialRefresh(). (ericsnowcurrently, Sep 29, 2020)
a973035  Add BackgroundLooper. (ericsnowcurrently, Sep 30, 2020)
b87ec6b  Only run a single refresh operation at a time. (ericsnowcurrently, Sep 30, 2020)
bf39d81  Simplify BackgroundLooper. (ericsnowcurrently, Sep 30, 2020)
5ad921d  Add support for retries. (ericsnowcurrently, Sep 30, 2020)
4a22fec  Eliminate the queue stability issues with getID(). (ericsnowcurrently, Sep 30, 2020)
8a561e0  Fix the logic of the run loop. (ericsnowcurrently, Sep 30, 2020)
6038238  Periodically refresh the cache. (ericsnowcurrently, Sep 30, 2020)
72becf8  Rely on onChanged to know if the cache is stale. (ericsnowcurrently, Sep 30, 2020)
764482a  Add BackgroundLooper.getNextRequest(). (ericsnowcurrently, Sep 30, 2020)
1a71c49  Factor out CachingLocator.iterFromDownstream(). (ericsnowcurrently, Sep 30, 2020)
99af470  Fix the tslint rules. (ericsnowcurrently, Sep 30, 2020)
e31ee15  Fix callbacks in SimpleLocator. (ericsnowcurrently, Oct 5, 2020)
d73de35  Fix a typo. (ericsnowcurrently, Oct 5, 2020)
c26b56c  Use a syntactic shortcut. (ericsnowcurrently, Oct 5, 2020)
e756686  Add doc comments on internal methods. (ericsnowcurrently, Oct 5, 2020)
316dd91  Make the different refresh scenarios a bit easier to follow. (ericsnowcurrently, Oct 5, 2020)
c1ee05b  Add doc comments for BackgroundLooper. (ericsnowcurrently, Oct 5, 2020)
d666874  Clarify a comment about popping the next request off the queue. (ericsnowcurrently, Oct 5, 2020)
b091801  Drop the retry/periodic code. (ericsnowcurrently, Sep 30, 2020)
0c90c6f  Fix typos. (ericsnowcurrently, Oct 7, 2020)
6bc3b00  Drop unnecessary eslint directives. (ericsnowcurrently, Oct 7, 2020)
89b6e5e  Move BackgroundRequestLooper to its own file. (ericsnowcurrently, Oct 7, 2020)
5fd0141  Drop a dead comment. (ericsnowcurrently, Oct 7, 2020)
68a618f  downstream -> wrapped (ericsnowcurrently, Oct 7, 2020)
adacb7c  refresh() -> addRefreshRequest() (ericsnowcurrently, Oct 7, 2020)
e8914f6  Clarify a potentially confusing situation in getGlobalPersistentStore(). (ericsnowcurrently, Oct 7, 2020)
134915b  Use a more concise syntax. (ericsnowcurrently, Oct 7, 2020)
fa63318  lint (ericsnowcurrently, Oct 8, 2020)

247 changes: 247 additions & 0 deletions src/client/common/utils/backgroundLoop.ts
@@ -0,0 +1,247 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

import { createDeferred } from './async';

type RequestID = number;
type RunFunc = () => Promise<void>;
type NotifyFunc = () => void;

/**
* This helps avoid running duplicate expensive operations.
*
* The key aspect is that already running or queued requests can be
* re-used instead of creating a duplicate request.
*/
export class BackgroundRequestLooper {
private readonly opts: {
runDefault: RunFunc;
};

private started = false;

private stopped = false;

private readonly done = createDeferred<void>();

private readonly loopRunning = createDeferred<void>();

private waitUntilReady = createDeferred<void>();

private running: RequestID | undefined;

// For now we don't worry about a max queue size.
private readonly queue: RequestID[] = [];

private readonly requests: Record<RequestID, [RunFunc, Promise<void>, NotifyFunc]> = {};

private lastID: number | undefined;

constructor(
opts: {
runDefault?: RunFunc | null;
} = {}
) {
this.opts = {
runDefault:
opts.runDefault ??
(async () => {
throw Error('no default operation provided');
})
};
}

/**
* Start the request execution loop.
*
* Currently it does not support being re-started.
*/
public start(): void {
if (this.stopped) {
throw Error('already stopped');
}
if (this.started) {
return;
}
this.started = true;

this.runLoop().ignoreErrors();
}

/**
* Stop the loop (assuming it was already started.)
*
* @returns - a promise that resolves once the loop has stopped.
*/
public stop(): Promise<void> {
if (this.stopped) {
return this.loopRunning.promise;
}
if (!this.started) {
throw Error('not started yet');
}
this.stopped = true;

this.done.resolve();

// It is conceivable that a separate "waitUntilStopped"
// operation would be useful. If it turned out to be desirable
// then at that point we could add such a method separately.
// It would do nothing more than `await this.loopRunning`.
// Currently there is no need for a separate method since
// returning the promise here is sufficient.
return this.loopRunning.promise;
}

/**
* Return the most recent active request, if any.
*
* If there are no pending requests then this is the currently
* running one (if one is running).
*
* @returns - the ID of the request and its completion promise;
* if there are no active requests then you get `undefined`
*/
public getLastRequest(): [RequestID, Promise<void>] | undefined {
let reqID: RequestID;
if (this.queue.length > 0) {
reqID = this.queue[this.queue.length - 1];
} else if (this.running !== undefined) {
reqID = this.running;
} else {
return undefined;
}
// The req cannot be undefined since every queued ID has a request.
const [, promise] = this.requests[reqID];
if (reqID === undefined) {
// The queue must be empty.
return undefined;
}
return [reqID, promise];
}

/**
* Return the request that is waiting to run next, if any.
*
* The request is the next one that will be run. This implies that
* there is one already running.
*
* @returns - the ID of the request and its completion promise;
* if there are no pending requests then you get `undefined`
*/
public getNextRequest(): [RequestID, Promise<void>] | undefined {
if (this.queue.length === 0) {
return undefined;
}
const reqID = this.queue[0];
// The req cannot be undefined since every queued ID has a request.
const [, promise] = this.requests[reqID]!;
return [reqID, promise];
}

/**
* Request that a function be run.
*
* If one is already running then the new request is added to the
* end of the queue. Otherwise it is run immediately.
*
* @returns - the ID of the new request and its completion promise;
* the promise resolves once the request has completed
*/
public addRequest(run?: RunFunc): [RequestID, Promise<void>] {
const reqID = this.getNextID();
// This is the only method that adds requests to the queue
// and `getNextID()` keeps us from having collisions here.
// So we are guaranteed that there are no matching requests
// in the queue.
const running = createDeferred<void>();
this.requests[reqID] = [
// [RunFunc, "done" promise, NotifyFunc]
run ?? this.opts.runDefault,
running.promise,
() => running.resolve()
];
this.queue.push(reqID);
if (this.queue.length === 1) {
// `waitUntilReady` will get replaced with a new deferred
// in the loop once the existing one gets used.
// We let the queue clear out before triggering the loop
// again.
this.waitUntilReady.resolve();
}
return [reqID, running.promise];
}

/**
* This is the actual loop where the queue is managed and waiting happens.
*/
private async runLoop(): Promise<void> {
const getWinner = () => {
const promises = [
// These are the competing operations.
// Note that the losers keep running in the background.
this.done.promise.then(() => 0),
this.waitUntilReady.promise.then(() => 1)
];
return Promise.race(promises);
};

let winner = await getWinner();
while (!this.done.completed) {
if (winner === 1) {
this.waitUntilReady = createDeferred<void>();
await this.flush();
} else {
// This should not be reachable.
throw Error(`unsupported winner ${winner}`);
}
winner = await getWinner();
}
this.loopRunning.resolve();
}

/**
* Run all pending requests, in queue order.
*
* Each request's completion promise resolves once that request
* finishes.
*/
private async flush(): Promise<void> {
if (this.running !== undefined) {
// We must be flushing the queue already.
return;
}
// Run every request in the queue.
while (this.queue.length > 0) {
const reqID = this.queue[0];
this.running = reqID;
// We pop the request off the queue here so it doesn't show
// up as both running and pending.
this.queue.shift();
const [run, , notify] = this.requests[reqID];

await run();

// We leave the request until right before `notify()`
// for the sake of any calls to `getLastRequest()`.
delete this.requests[reqID];
notify();
}
this.running = undefined;
}

/**
* Provide the request ID to use next.
*/
private getNextID(): RequestID {
// For now there is no way to queue up a request with
// an ID that did not originate here. So we don't need
// to worry about collisions.
if (this.lastID === undefined) {
this.lastID = 1;
} else {
this.lastID += 1;
}
return this.lastID;
}
}
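
A minimal usage sketch of the class above (not taken from this PR's diff): a caller can coalesce duplicate background work by reusing the completion promise of an already running or queued request instead of adding a new one, which is the reuse described in the class doc comment. The refreshCache function and the ensureRefresh helper are illustrative assumptions.

    import { BackgroundRequestLooper } from './backgroundLoop';

    // Hypothetical expensive operation; the real caller supplies its own.
    async function refreshCache(): Promise<void> {
        // ...expensive work...
    }

    const looper = new BackgroundRequestLooper({ runDefault: refreshCache });
    looper.start();

    // Reuse an in-flight or queued request rather than queuing a duplicate.
    function ensureRefresh(): Promise<void> {
        const last = looper.getLastRequest();
        if (last !== undefined) {
            const [, promise] = last;
            return promise;
        }
        const [, promise] = looper.addRequest();
        return promise;
    }

    // On shutdown: await looper.stop();
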
41 changes: 24 additions & 17 deletions src/client/pythonEnvironments/base/envsCache.ts
@@ -2,7 +2,6 @@
// Licensed under the MIT License.

import { cloneDeep } from 'lodash';
import { getGlobalPersistentStore, IPersistentStore } from '../common/externalDependencies';
import { PythonEnvInfo } from './info';
import { areSameEnv } from './info/env';

@@ -13,7 +12,7 @@ export interface IEnvsCache {
/**
* Initialization logic to be done outside of the constructor, for example reading from persistent storage.
*/
initialize(): void;
initialize(): Promise<void>;

/**
* Return all environment info currently in memory for this session.
@@ -30,7 +29,7 @@
setAllEnvs(envs: PythonEnvInfo[]): void;

/**
* If the cache has been initialized, return environmnent info objects that match a query object.
* If the cache has been initialized, return environment info objects that match a query object.
* If none of the environments in the cache match the query data, return an empty array.
* If the in-memory cache has not been initialized prior to calling `filterEnvs`, return `undefined`.
*
@@ -40,14 +39,19 @@
* @return The environment info objects matching the `env` param,
* or `undefined` if the in-memory cache is not initialized.
*/
filterEnvs(env: PythonEnvInfo | string): PythonEnvInfo[] | undefined;
filterEnvs(query: Partial<PythonEnvInfo>): PythonEnvInfo[] | undefined;

/**
* Writes the content of the in-memory cache to persistent storage.
*/
flush(): Promise<void>;
}

export interface IPersistentStorage {
load(): Promise<PythonEnvInfo[] | undefined>;
store(envs: PythonEnvInfo[]): Promise<void>;
}

type CompleteEnvInfoFunction = (envInfo: PythonEnvInfo) => boolean;

/**
@@ -58,18 +62,23 @@ export class PythonEnvInfoCache implements IEnvsCache {

private envsList: PythonEnvInfo[] | undefined;

private persistentStorage: IPersistentStore<PythonEnvInfo[]> | undefined;
private persistentStorage: IPersistentStorage | undefined;

constructor(private readonly isComplete: CompleteEnvInfoFunction) {}
constructor(
private readonly isComplete: CompleteEnvInfoFunction,
private readonly getPersistentStorage?: () => IPersistentStorage,
) {}

public initialize(): void {
public async initialize(): Promise<void> {
if (this.initialized) {
return;
}

this.initialized = true;
this.persistentStorage = getGlobalPersistentStore<PythonEnvInfo[]>('PYTHON_ENV_INFO_CACHE');
this.envsList = this.persistentStorage?.get();
if (this.getPersistentStorage !== undefined) {
this.persistentStorage = this.getPersistentStorage();
this.envsList = await this.persistentStorage.load();
}
}

public getAllEnvs(): PythonEnvInfo[] | undefined {
@@ -80,21 +89,19 @@ export class PythonEnvInfoCache implements IEnvsCache {
this.envsList = cloneDeep(envs);
}

public filterEnvs(env: PythonEnvInfo | string): PythonEnvInfo[] | undefined {
const result = this.envsList?.filter((info) => areSameEnv(info, env));

if (result) {
return cloneDeep(result);
public filterEnvs(query: Partial<PythonEnvInfo>): PythonEnvInfo[] | undefined {
if (this.envsList === undefined) {
return undefined;
}

return undefined;
const result = this.envsList.filter((info) => areSameEnv(info, query));
return cloneDeep(result);
}

public async flush(): Promise<void> {
const completeEnvs = this.envsList?.filter(this.isComplete);

if (completeEnvs?.length) {
await this.persistentStorage?.set(completeEnvs);
await this.persistentStorage?.store(completeEnvs);
}
}
}
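
A minimal wiring sketch for the updated cache API (an assumption-laden illustration, not code from this PR): the in-memory storage adapter, the isComplete predicate, and the PythonEnvInfo fields used below are placeholders; in the extension the storage is presumably supplied elsewhere via getGlobalPersistentStore() through the getPersistentStorage callback.

    import { IPersistentStorage, PythonEnvInfoCache } from './envsCache';
    import { PythonEnvInfo } from './info';

    // Illustrative in-memory storage adapter; a real caller would back this
    // with an actual persistent store.
    let saved: PythonEnvInfo[] | undefined;
    const storage: IPersistentStorage = {
        load: async () => saved,
        store: async (envs: PythonEnvInfo[]) => {
            saved = envs;
        },
    };

    // The predicate decides which entries are "complete" enough to persist;
    // the field checked here is an assumption about PythonEnvInfo's shape.
    const cache = new PythonEnvInfoCache(
        (env: PythonEnvInfo) => env.name !== '',
        () => storage,
    );

    async function example(envs: PythonEnvInfo[]): Promise<PythonEnvInfo[] | undefined> {
        await cache.initialize(); // loads previously persisted envs, if any
        cache.setAllEnvs(envs); // replace the in-memory list
        const matches = cache.filterEnvs({ name: 'base' }); // query by partial env info
        await cache.flush(); // persist only the "complete" entries
        return matches;
    }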