fix typing issue and add typing test for llamaindexserver templates #613

Merged
merged 11 commits into main from lee/typing-issue on May 13, 2025

Conversation

leehuwuj
Collaborator

@leehuwuj leehuwuj commented May 12, 2025

Summary by CodeRabbit

  • New Features
    • Added new data ingestion scripts for agentic RAG and financial report use cases in TypeScript templates.
  • Bug Fixes
    • Addressed a typing check issue in the Python template.
  • Chores
    • Updated end-to-end test workflows to support multiple template types and use cases.
    • Improved and extended test coverage for dependency resolution in both Python and TypeScript projects.
    • Updated directory paths for template file copying and removed redundant README template copying.
    • Upgraded "@llamaindex/tools" dependency to version 0.0.9.
    • Excluded additional directories from code formatting.
  • Style
    • Suppressed type checking for a specific import in the Python FastAPI template.
  • Enhancements
    • Made the LlamaCloud usage prompt conditional on the app type, skipping it where it is not needed (e.g. for the "artifacts" app type).


changeset-bot bot commented May 12, 2025

🦋 Changeset detected

Latest commit: 53ea76a

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:
  • create-llama: Patch



coderabbitai bot commented May 12, 2025

Walkthrough

This update restructures and extends the end-to-end test suites for Python and TypeScript in the create-llama package to support multiple template types and use cases. It updates helper functions, modifies file paths from "workflows" to "use-cases," introduces new template files, adjusts CI workflow matrices, updates dependency versions, adds new ignore patterns, and refines prompt logic.

Changes

  • .changeset/fifty-badgers-jog.md: Added a patch changeset describing a fix for a typing check issue.
  • .github/workflows/e2e.yml: Added a template-types matrix for end-to-end tests and set the TEMPLATE_TYPE environment variable accordingly.
  • .prettierignore: Added ignore patterns for the playwright-report and test-results directories.
  • packages/create-llama/e2e/python/resolve_dependencies.spec.ts, packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts: Refactored and extended the test suites to support multiple template types and use cases, introduced environment-variable-based skipping, and unified the test function interfaces.
  • packages/create-llama/helpers/python.ts, packages/create-llama/helpers/typescript.ts: Updated file copy paths from "workflows" to "use-cases" for template and UI component installation; removed the README copy in the TypeScript helper; added conditional simplification for the "artifacts" use case.
  • packages/create-llama/templates/components/use-cases/typescript/agentic_rag/src/generate.ts, packages/create-llama/templates/components/use-cases/typescript/financial_report/src/generate.ts: Added new TypeScript scripts for data ingestion and storage context generation for the agentic RAG and financial report use cases.
  • packages/create-llama/templates/types/llamaindexserver/fastapi/generate.py: Suppressed type checking for a specific import by adding # type: ignore.
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json: Updated the "@llamaindex/tools" dependency version from "0.0.4" to "0.0.9".
  • packages/create-llama/questions/simple.ts: Made the prompt for using LlamaCloud conditional on the app type, skipping it for "artifacts".

Sequence Diagram(s)

sequenceDiagram
    participant CI as GitHub Actions CI
    participant TestPy as Python E2E Test Suite
    participant TestTS as TypeScript E2E Test Suite
    participant Env as Environment

    CI->>Env: Set TEMPLATE_TYPE from matrix (streaming/llamaindexserver)
    CI->>TestPy: Run Python tests with TEMPLATE_TYPE
    CI->>TestTS: Run TypeScript tests with TEMPLATE_TYPE

    alt TEMPLATE_TYPE == streaming
        TestPy->>TestPy: Iterate over vector DBs, tools, datasources, observability
        TestPy->>TestPy: Create & check project, assert dependencies
        TestTS->>TestTS: Iterate over vector DBs and LlamaParse
        TestTS->>TestTS: Create project, install deps, type check
    else TEMPLATE_TYPE == llamaindexserver
        TestPy->>TestPy: Iterate over use cases
        TestPy->>TestPy: Create & check project (no dep assertion)
        TestTS->>TestTS: Iterate over use cases and LlamaParse
        TestTS->>TestTS: Create project, install deps, type check
    end
sequenceDiagram
    participant User as User
    participant GenerateTS as generate.ts (agentic_rag/financial_report)
    participant Storage as StorageContext
    participant Reader as SimpleDirectoryReader
    participant Index as VectorStoreIndex

    User->>GenerateTS: Run CLI command (e.g., "datasource")
    GenerateTS->>Storage: Create storage context with persist_dir
    GenerateTS->>Reader: Load documents from "data" directory
    Reader-->>GenerateTS: Return documents
    GenerateTS->>Index: Create vector index from documents and storage context
    Index-->>GenerateTS: Index created and persisted
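
As a rough illustration of the flow in the diagram above, a minimal sketch of such a generate.ts script might look like the following, assuming LlamaIndexTS's storageContextFromDefaults, SimpleDirectoryReader, and VectorStoreIndex.fromDocuments APIs (the actual template files added in this PR may use different import paths and options):

```ts
// Sketch only; import paths and option names are assumptions, not the PR's exact code.
import {
  SimpleDirectoryReader,
  storageContextFromDefaults,
  VectorStoreIndex,
} from "llamaindex";

async function generateDatasource() {
  // Persist the index under ./storage so the generated server can load it at runtime.
  const storageContext = await storageContextFromDefaults({ persistDir: "storage" });

  // Read every file in the local "data" directory into Document objects.
  const documents = await new SimpleDirectoryReader().loadData({ directoryPath: "data" });

  // Build the vector index; passing the storage context persists it to disk.
  await VectorStoreIndex.fromDocuments(documents, { storageContext });
  console.log("Storage context successfully generated.");
}

generateDatasource().catch(console.error);
```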

Possibly related PRs

Suggested reviewers

  • marcusschiesser

Poem

🐇
Hopping through the use-case maze,
Tests now run in clever arrays—
From "workflows" to "use-cases" we leap,
With matrixed types and coverage deep.
Llama tools now up to date,
Our templates shine—oh, isn't it great?
CI bunnies cheer, their work complete!


@leehuwuj leehuwuj marked this pull request as ready for review May 13, 2025 04:47
@leehuwuj leehuwuj changed the title fix typing issue fix typing issue and add typing test for llamaindexserver templates May 13, 2025

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

🔭 Outside diff range comments (1)
packages/create-llama/e2e/python/resolve_dependencies.spec.ts (1)

192-215: 🛠️ Refactor suggestion

⚠️ Potential issue

async describe is invalid – move the heavy work into proper test() blocks

test.describe must be synchronous and should contain only test definitions.
Running asynchronous I/O (await createTestDir() / createAndCheckLlamaProject) during the declaration phase will:

  1. Make Playwright treat the promise as an unknown test, causing flakiness or silent skips.
  2. Execute resource-hungry commands before the runner’s own timeout/parallelism controls take effect.

Refactor the loop so each useCase is wrapped by an individual test():

-test.describe("LlamaIndexServer", async () => {
-  test.skip(templateType !== "llamaindexserver",
-    `skipping llamaindexserver test for ${templateType}`);
-  for (const useCase of useCases) {
-    const cwd = await createTestDir();
-    await createAndCheckLlamaProject({
-      options: {
-        cwd,
-        templateType: "llamaindexserver",
-        ...
-        useCase,
-      },
-    });
-  }
-});
+test.describe("LlamaIndexServer", () => {
+  test.skip(
+    templateType !== "llamaindexserver",
+    `skipping llamaindexserver test for ${templateType}`,
+  );
+
+  for (const useCase of useCases) {
+    test(`use-case: ${useCase}`, async () => {
+      const cwd = await createTestDir();
+      await createAndCheckLlamaProject({
+        options: {
+          cwd,
+          templateType: "llamaindexserver",
+          ...
+          useCase,
+        },
+      });
+    });
+  }
+});

This keeps test discovery deterministic and ties failures to the right test names.

🧹 Nitpick comments (5)
.changeset/fifty-badgers-jog.md (1)

5-5: Consider elaborating the summary.
The summary "Fix typing check issue" is concise but could benefit from referencing the specific typing validation or issue number (e.g., “Fix typing check for X in create-llama (#613)”) to improve traceability.

packages/create-llama/templates/types/llamaindexserver/fastapi/generate.py (1)

54-54: Avoid suppressing type errors with # type: ignore.
Using # type: ignore can mask underlying typing or import-path issues. Consider updating app/workflow.py to properly export UIEventData or adjusting the generated template’s import path to align with your new use-cases structure, rather than silencing the error.

packages/create-llama/templates/components/use-cases/typescript/financial_report/src/generate.ts (1)

1-40: Code duplication between use cases

This file is identical to the agentic_rag generate.ts. While duplication is sometimes necessary for templates, consider if this could be abstracted into a shared utility in the future.

For now, this implementation looks correct and works as expected for the financial report use case.

packages/create-llama/e2e/python/resolve_dependencies.spec.ts (2)

134-136: Remove unused dataSourceType to appease linters

The temporary variable is computed but never referenced.

-const dataSourceType = dataSource.split(" ")[0];

Deleting it avoids an “unused variable” warning and keeps the test minimal.


27-29: Consider running the whole suite serially to avoid resource contention

Each test spins up uv, installs dependencies, and runs mypy, which is CPU- and network-intensive.
Running all of them in parallel (Playwright’s default) can:

• Exceed CI memory limits
• Cause transient network-related failures (PyPI rate limits, etc.)

If build time allows, add:

-test.describe("Mypy check", () => {
-  test.describe.configure({ retries: 0 });
+test.describe.configure({ mode: "serial", retries: 0 });
+test.describe("Mypy check", () => {

This still keeps nested tests but executes them one at a time, improving stability.

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 556f33c and 39a4ee5.

📒 Files selected for processing (11)
  • .changeset/fifty-badgers-jog.md (1 hunks)
  • .github/workflows/e2e.yml (4 hunks)
  • .prettierignore (1 hunks)
  • packages/create-llama/e2e/python/resolve_dependencies.spec.ts (2 hunks)
  • packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (2 hunks)
  • packages/create-llama/helpers/python.ts (2 hunks)
  • packages/create-llama/helpers/typescript.ts (1 hunks)
  • packages/create-llama/templates/components/use-cases/typescript/agentic_rag/src/generate.ts (1 hunks)
  • packages/create-llama/templates/components/use-cases/typescript/financial_report/src/generate.ts (1 hunks)
  • packages/create-llama/templates/types/llamaindexserver/fastapi/generate.py (1 hunks)
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)
packages/create-llama/helpers/python.ts (1)
packages/create-llama/helpers/copy.ts (1)
  • copy (13-49)
packages/create-llama/helpers/typescript.ts (1)
packages/create-llama/helpers/copy.ts (2)
  • copy (13-49)
  • assetRelocator (51-70)
packages/create-llama/e2e/python/resolve_dependencies.spec.ts (2)
packages/create-llama/helpers/types.ts (4)
  • TemplateFramework (29-29)
  • TemplateType (22-28)
  • TemplateUseCase (53-61)
  • TemplateVectorDB (31-41)
packages/create-llama/e2e/utils.ts (3)
  • createTestDir (160-164)
  • RunCreateLlamaOptions (21-37)
  • runCreateLlama (39-158)
⏰ Context from checks skipped due to timeout of 90000ms (48)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
🔇 Additional comments (15)
.changeset/fifty-badgers-jog.md (1)

1-3: Changeset front-matter is correctly formatted.
The patch declaration and metadata follow the expected Changesets syntax.

packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (1)

15-15:

❓ Verification inconclusive

Verify the new dependency version.
Ensure that @llamaindex/[email protected] is published and compatible with the rest of the template. Run a registry query to confirm availability and check for any breaking changes or advisories before bumping this dependency.


🏁 Script executed:

#!/bin/bash
# Check available versions for @llamaindex/tools
npm view @llamaindex/tools versions --json

Length of output: 135


Confirm @llamaindex/[email protected] Availability & Compatibility

  • Verified via npm view @llamaindex/tools versions --json that 0.0.9 is published.
  • Next steps before merging:
    • Review the CHANGELOG/release notes (on npm or the GitHub repo) for any breaking changes.
    • Inspect the package metadata:
      npm view @llamaindex/[email protected]
    • Run npm audit or your preferred vulnerability/scanning tool against the template.
    • Perform a local build and smoke tests of the create-llama/templates/types/llamaindexserver/nextjs scaffold to confirm nothing breaks.

File:
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json

- "@llamaindex/tools": "0.0.8",
+ "@llamaindex/tools": "0.0.9",
.prettierignore (1)

10-11: Approve ignore patterns for test artifacts.
Excluding **/playwright-report/ and **/test-results/ prevents transient test outputs from being formatted, which is aligned with typical CI-generated directories.

packages/create-llama/helpers/python.ts (3)

572-573: Confirm updated source path for Python templates.
The cwd has changed to components/use-cases/python/<useCase>. Please verify that packages/create-llama/templates/components/use-cases/python/ contains subfolders for each supported use case to avoid missing template errors.


578-579: Validate UI components copy path.
The helper now targets components/ui/use-cases/<useCase>. Ensure each use-case directory under packages/create-llama/templates/components/ui/use-cases/ exists and includes the expected UI files.


609-610: Check README template relocation.
README-template.md is now sourced from components/use-cases/python/<useCase>. Confirm that every use-case folder includes this file so generated projects receive the proper documentation.

packages/create-llama/helpers/typescript.ts (3)

34-43: Update looks good

The path change from "workflows" to "use-cases" and the modification to copy all files with the asset relocator ensures proper template installation.


46-49: Path update is consistent

This path update for UI components maintains consistency with the directory structure changes.


51-52: Helpful clarifying comment

The comment clearly explains when the generate.ts file is overridden, improving code maintainability.

packages/create-llama/templates/components/use-cases/typescript/agentic_rag/src/generate.ts (1)

1-40: Implementation looks good

The generate.ts file correctly implements data ingestion and vector store index creation for the agentic RAG use case with proper error handling and user feedback.

packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (5)

19-21: Default template type setup

The default value for templateType ensures tests can run without environment variables, which is good for development.


22-27: Array of use cases for testing

Good comprehensive list of use cases to test the template generation functionality.


54-57: Appropriate test skipping logic

The conditional test skipping for incompatible template types prevents false test failures.


68-90: Enhanced test coverage

The nested test structure with both llamaParse enabled and disabled provides thorough coverage of all use cases.


93-98: Improved function signature

Converting from positional parameters to an options object improves code readability and maintainability.
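
As a sketch of that refactor (field names here are illustrative assumptions, not the spec's actual options):

```ts
// Sketch only: an options object replaces a long positional parameter list.
type RunTestOptions = {
  templateType: "streaming" | "llamaindexserver";
  useCase?: string;
  vectorDb?: string;
  llamaParse?: boolean;
};

async function runTest({ templateType, useCase, vectorDb, llamaParse }: RunTestOptions) {
  // Named fields keep call sites readable and make new options easy to add, e.g.
  // runTest({ templateType: "llamaindexserver", useCase: "agentic_rag", llamaParse: true });
  console.log(`testing ${templateType}`, { useCase, vectorDb, llamaParse });
}
```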


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (1)

19-21: TODO has been addressed.

The TODO comment from the previous review about "Moving templateType to CI matrix" has been addressed by adding the TEMPLATE_TYPE environment variable.

🧹 Nitpick comments (2)
packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (2)

46-47: Consider adding a comment explaining why retries are disabled.

Setting retries: 0 might lead to flaky CI runs if tests occasionally fail due to external factors.

-  test.describe.configure({ retries: 0 });
+  // Disable retries to catch any dependency issues early
+  test.describe.configure({ retries: 0 });

83-91: Add a comment explaining why LlamaParse tests are skipped for 'artifacts'.

This condition properly skips LlamaParse tests for the "artifacts" use case, but a comment would improve code clarity.

-      // Skipping llamacloud for the use case doesn't use index.
+      // Skip LlamaParse tests for "artifacts" since it doesn't use indexing features
+      // and doesn't require LlamaCloud services (consistent with simple.ts prompts)
       if (useCase !== "artifacts") {
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 1780895 and 336e3e3.

📒 Files selected for processing (2)
  • packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (2 hunks)
  • packages/create-llama/questions/simple.ts (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (2)
packages/create-llama/helpers/types.ts (4)
  • TemplateFramework (29-29)
  • TemplateType (22-28)
  • TemplateUseCase (53-61)
  • TemplateVectorDB (31-41)
packages/create-llama/e2e/utils.ts (2)
  • createTestDir (160-164)
  • runCreateLlama (39-158)
⏰ Context from checks skipped due to timeout of 90000ms (55)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
  • GitHub Check: lint
🔇 Additional comments (5)
packages/create-llama/questions/simple.ts (1)

79-93: LGTM! Improved user experience for 'artifacts' app type.

This change correctly skips the LlamaCloud services prompt when the user selects the "artifacts" app type, which makes sense since this type doesn't require LlamaCloud integration based on its definition in the lookup table at lines 156-161.
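
For context, a minimal sketch of such a conditional prompt, assuming the prompts package commonly used for CLI questions (the real simple.ts may be structured differently):

```ts
// Sketch only; variable names and prompt wording are assumptions.
import prompts from "prompts";

async function askUseLlamaCloud(appType: string): Promise<boolean> {
  // The "artifacts" app type does not use an index, so skip the question entirely.
  if (appType === "artifacts") return false;

  const { useLlamaCloud } = await prompts({
    type: "toggle",
    name: "useLlamaCloud",
    message: "Do you want to use LlamaCloud services?",
    initial: false,
    active: "Yes",
    inactive: "No",
  });
  return Boolean(useLlamaCloud);
}
```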

packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (4)

6-11: LGTM! Proper type imports.

Adding these specific type imports from helpers/types.ts improves type safety and makes the code more maintainable.


50-57: LGTM! Improved test descriptions and proper test skipping.

Including the template type in the test description and conditionally skipping vector DB tests for the "llamaindexserver" template type improves test clarity and efficiency.


67-93: LGTM! Comprehensive test coverage for different use cases.

This new test suite properly tests each use case with and without LlamaParse, and correctly skips LlamaParse tests for the "artifacts" use case, maintaining consistency with the changes in simple.ts.


96-157: LGTM! Improved function signature with options object.

Refactoring the runTest function to accept a single options object instead of multiple parameters makes the code more maintainable and easier to extend in the future.

…on data source and refining use case tests for example data source

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

♻️ Duplicate comments (2)
packages/create-llama/e2e/python/resolve_dependencies.spec.ts (2)

165-187: Replace nested describe in loop with test().

Using a nested describe inside a loop creates asynchronous side-effects during test declaration and complicates the test hierarchy.

Apply this diff to fix the issue:

-for (const observability of observabilityOptions) {
-  test.describe(`observability: ${observability} ${templateType}`, async () => {
-    const cwd = await createTestDir();
-
-    const { pyprojectPath } = await createAndCheckLlamaProject({
-      options: {
-        cwd,
-        templateType: "streaming",
-        templateFramework,
-        dataSource: "--example-file",
-        vectorDb: "none",
-        tools: "none",
-        port: 3000,
-        postInstallAction: "none",
-        templateUI: undefined,
-        appType: "--no-frontend",
-        llamaCloudProjectName: undefined,
-        llamaCloudIndexName: undefined,
-        observability,
-      },
-    });
-  });
+for (const observability of observabilityOptions) {
+  test(`observability: ${observability} ${templateType}`, async () => {
+    const cwd = await createTestDir();
+
+    const { pyprojectPath } = await createAndCheckLlamaProject({
+      options: {
+        cwd,
+        templateType: "streaming",
+        templateFramework,
+        dataSource: "--example-file",
+        vectorDb: "none",
+        tools: "none",
+        port: 3000,
+        postInstallAction: "none",
+        templateUI: undefined,
+        appType: "--no-frontend",
+        llamaCloudProjectName: undefined,
+        llamaCloudIndexName: undefined,
+        observability,
+      },
+    });
+    // Optional: Add assertions for observability packages if needed
+  });
}

239-249: Large command output can overflow Node's default exec buffer.

The execAsync calls for UV operations inherit the default maxBuffer (1 MiB), which can be exceeded by uv sync or mypy output.

Apply this diff to increase the buffer size for all three command executions:

-const { stdout: venvStdout, stderr: venvStderr } = await execAsync(
-  "uv venv",
-  { cwd: projectPath, env: commandEnv },
-);
+const { stdout: venvStdout, stderr: venvStderr } = await execAsync(
+  "uv venv",
+  { cwd: projectPath, env: commandEnv, maxBuffer: 10 * 1024 * 1024 }, // 10 MiB
+);

// (Apply similar changes to the other two execAsync calls below)

Make similar changes to the uv sync and uv run mypy calls to prevent buffer overflow errors.

Also applies to: 253-262, 266-277

🧹 Nitpick comments (2)
packages/create-llama/e2e/python/resolve_dependencies.spec.ts (2)

217-283: Well-structured helper function with improved return type.

The updated createAndCheckLlamaProject function is well-structured and now correctly returns both the project path and pyproject.toml path, which is useful for assertions in the calling tests.

However, consider adding more robust error handling and validation. For example, you could parse the pyproject.toml file to verify its structure, or check for specific error patterns in the command outputs.
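
A minimal sketch of the kind of validation that suggestion points at, using only Node built-ins (a real check might use a proper TOML parser instead of regexes):

```ts
// Sketch only; the fields checked here are assumptions about the generated pyproject.toml.
import fs from "node:fs/promises";

async function assertPyprojectLooksValid(pyprojectPath: string): Promise<void> {
  const content = await fs.readFile(pyprojectPath, "utf-8");
  // A generated project should at least declare a [project] table with dependencies.
  if (!/^\[project\]/m.test(content)) {
    throw new Error(`Missing [project] table in ${pyprojectPath}`);
  }
  if (!/^dependencies\s*=/m.test(content)) {
    throw new Error(`No dependencies declared in ${pyprojectPath}`);
  }
}
```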


99-99: Remove redundant comment markers.

There are redundant double slashes in the comments on lines 99 and 131.

-// // Test tools
+// Test tools

-// // Test data sources
+// Test data sources

Also applies to: 131-131

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 336e3e3 and 53ea76a.

📒 Files selected for processing (2)
  • packages/create-llama/e2e/python/resolve_dependencies.spec.ts (2 hunks)
  • packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • packages/create-llama/e2e/typescript/resolve_dependencies.spec.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/e2e/python/resolve_dependencies.spec.ts (2)
packages/create-llama/helpers/types.ts (4)
  • TemplateFramework (29-29)
  • TemplateType (22-28)
  • TemplateUseCase (53-61)
  • TemplateVectorDB (31-41)
packages/create-llama/e2e/utils.ts (3)
  • createTestDir (160-164)
  • RunCreateLlamaOptions (21-37)
  • runCreateLlama (39-158)
⏰ Context from checks skipped due to timeout of 90000ms (54)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
🔇 Additional comments (3)
packages/create-llama/e2e/python/resolve_dependencies.spec.ts (3)

6-22: Added typing imports and template configuration look good.

The additional imports and default configuration for template types and use cases are well-structured. Using environment variables with sensible defaults is a good practice for test flexibility.


27-61: Well-structured test organization with conditional skipping.

The nested test structure with conditional skipping based on template type is a good approach for organizing template-specific tests. The comprehensive test matrices for vector databases, tools, data sources, and observability options provide good coverage.


191-215: Good implementation of LlamaIndexServer test suite with use cases.

The test structure for LlamaIndexServer with conditional skipping and iterating through different use cases is well-implemented. This ensures proper testing of the new template type.

@leehuwuj leehuwuj merged commit b3eb0ba into main May 13, 2025
60 checks passed
@leehuwuj leehuwuj deleted the lee/typing-issue branch May 13, 2025 09:20