Comprehensive Reusable Component Tools for Every Need

Get access to reusable component solutions that address multiple requirements. One-stop resources for streamlined workflows.

Reusable Components

  • Tangle.io leverages AI to provide a fast, secure, scalable low-code platform for enterprise applications.
    What is Tangle.io?
    Tangle.io is a low-code platform designed to reduce the cost, time, and risk of software projects. It offers an intuitive, fast, and scalable environment with AI integration and reusable components, aiming to accelerate application development while maintaining robust security and flexibility. Organizations can use Tangle.io for a wide range of enterprise solutions, optimizing internal processes and improving overall productivity.
  • scenario-go is a Go SDK for defining complex LLM-driven conversational workflows, managing prompts, context, and multi-step AI tasks.
    What is scenario-go?
    scenario-go is a framework for constructing AI agents in Go: developers author scenario definitions that specify step-by-step interactions with large language models. Each scenario can incorporate prompt templates, custom functions, and memory storage to maintain conversational state across multiple turns. The toolkit integrates with leading LLM providers via RESTful APIs, enabling dynamic input-output cycles and conditional branching based on AI responses. Built-in logging and error handling simplify debugging and monitoring of AI workflows. Developers can compose reusable scenario components, chain multiple AI tasks, and extend functionality through plugins, giving a streamlined experience for building chatbots, data extraction pipelines, virtual assistants, and automated customer support agents entirely in Go. (A conceptual sketch of this scenario pattern appears after this list.)
  • Wizard Language is a declarative TypeScript DSL to define multi-step AI agents with prompt orchestration and tool integration.
    What is Wizard Language?
    Wizard Language is a declarative domain-specific language built on TypeScript for authoring AI assistants as multi-step wizards. Developers define intent-driven steps, prompts, tool invocations, memory stores, and branching logic in a concise DSL. Under the hood, Wizard Language compiles these definitions into orchestrated LLM calls, managing context, asynchronous flows, and error handling. It accelerates prototyping of chatbots, data retrieval assistants, and automated workflows by abstracting prompt engineering and state management into reusable components. (A conceptual sketch of this step-and-tool pattern appears after this list.)
  • AIKitchen is an open-source framework for building AI agents using modular pipelines, tasks, advanced memory management, and scalable LLM integration.
    What is AIKitchen?
    AIKitchen is a developer-friendly Python toolkit for composing AI agents from modular building blocks. At its core are pipeline definitions with stages for input preprocessing, LLM invocation, tool execution, and memory retrieval. Integrations with popular LLM providers keep it flexible, while built-in memory stores track conversational context. Developers can embed custom tasks, use retrieval-augmented generation for knowledge access, and gather standardized metrics to monitor performance. The framework also supports workflow orchestration, with sequential and conditional flows across multiple agents. With its plugin architecture, AIKitchen streamlines end-to-end agent development, from prototyping research ideas to deploying scalable digital workers in production. (A conceptual sketch of this stage-pipeline pattern appears after this list.)
  • Labs is an AI orchestration framework enabling developers to define and run autonomous LLM agents via a simple DSL.
    What is Labs?
    Labs is an open-source, embeddable domain-specific language designed for defining and executing AI agents using large language models. It provides constructs to declare prompts, manage context, branch conditionally, and integrate external tools (e.g., databases, APIs). With Labs, developers describe agent workflows as code, orchestrating multi-step tasks like data retrieval, analysis, and generation. The framework compiles DSL scripts into executable pipelines that can be run locally or in production. Labs offers an interactive REPL and command-line tooling, and integrates with standard LLM providers. Its modular architecture allows easy extension with custom functions and utilities, promoting rapid prototyping and maintainable agent development. The lightweight runtime ensures low overhead and seamless embedding in existing applications. (A toy example of compiling a small script into a pipeline appears after this list.)
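
To make the scenario-go description concrete, here is a minimal hand-rolled sketch of the pattern it describes: named steps with prompt templates, a shared memory of prior turns, and the next step chosen from the model's reply. It is written in Python purely for illustration (scenario-go itself is a Go SDK), every name is invented, and `call_llm` stands in for whatever provider client you use; none of this is the library's actual API.

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM provider call (e.g. an HTTP client)."""
    raise NotImplementedError("wire up a provider of your choice here")

@dataclass
class Step:
    name: str
    template: str                              # prompt template with {placeholders}
    next_step: Callable[[str], Optional[str]]  # maps the reply to the next step name

@dataclass
class Scenario:
    steps: dict[str, Step]
    memory: list[str] = field(default_factory=list)  # conversational state across turns

    def run(self, start: str, **inputs: str) -> list[str]:
        current: Optional[str] = start
        while current is not None:
            step = self.steps[current]
            prompt = step.template.format(history="\n".join(self.memory), **inputs)
            reply = call_llm(prompt)
            self.memory.append(f"{step.name}: {reply}")
            current = step.next_step(reply)    # conditional branching on the reply
        return self.memory

# Example: a two-step triage scenario (hypothetical).
triage = Scenario(steps={
    "classify": Step(
        name="classify",
        template="History:\n{history}\nClassify this ticket: {ticket}",
        next_step=lambda reply: "escalate" if "urgent" in reply.lower() else None,
    ),
    "escalate": Step(
        name="escalate",
        template="History:\n{history}\nDraft an escalation summary for: {ticket}",
        next_step=lambda reply: None,
    ),
})
# triage.run("classify", ticket="Payment page returns a 500 error")
```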
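
The Wizard Language entry describes declaring intent-driven steps, prompts, and tool invocations as data and letting a runtime orchestrate the LLM calls. The sketch below shows that general idea with a hand-rolled runner; it is a Python illustration of the concept rather than Wizard Language's TypeScript DSL, and the wizard, tool, and `call_llm` names are all hypothetical.

```python
from typing import Any, Callable

def call_llm(prompt: str) -> str:
    """Placeholder for an LLM provider call."""
    raise NotImplementedError

# A wizard declared as plain data: each step names an intent and either a prompt
# template or a tool whose result is stored for later steps to reference.
WeatherWizard = [
    {"intent": "ask_city", "prompt": "Extract the city name from: {user_input}"},
    {"intent": "lookup",   "tool": "get_forecast"},
    {"intent": "answer",   "prompt": "Summarise this forecast for the user: {lookup}"},
]

TOOLS: dict[str, Callable[[dict[str, Any]], str]] = {
    "get_forecast": lambda ctx: f"(forecast for {ctx['ask_city']})",  # stub tool
}

def run_wizard(steps: list[dict[str, Any]], user_input: str) -> dict[str, Any]:
    ctx: dict[str, Any] = {"user_input": user_input}  # shared memory across steps
    for step in steps:
        if "tool" in step:
            ctx[step["intent"]] = TOOLS[step["tool"]](ctx)
        else:
            ctx[step["intent"]] = call_llm(step["prompt"].format(**ctx))
    return ctx
```

Keeping the wizard as data and the runner generic is what lets branching, context handling, and error handling live in one place, which is presumably the point of a DSL like this.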
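
AIKitchen is described as composing agents from pipeline stages such as input preprocessing, retrieval, and LLM invocation over shared state. The following is a minimal sketch of that stage-pipeline idea, assuming nothing about AIKitchen's real interfaces; the `Pipeline` class, stage functions, and `call_llm` stub are invented for illustration.

```python
from typing import Callable

State = dict[str, object]
Stage = Callable[[State], State]

def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in a provider client here")

# Example stages mirroring the description: preprocessing, retrieval, generation.
def preprocess(state: State) -> State:
    state["query"] = str(state["raw_input"]).strip().lower()
    return state

def retrieve(state: State) -> State:
    # Stand-in for retrieval-augmented generation: fetch documents for the query.
    state["context"] = ["(documents matching: %s)" % state["query"]]
    return state

def generate(state: State) -> State:
    prompt = f"Context: {state['context']}\nQuestion: {state['query']}"
    state["answer"] = call_llm(prompt)
    return state

class Pipeline:
    """Runs stages in order, passing a shared state dict between them."""
    def __init__(self, stages: list[Stage]) -> None:
        self.stages = stages

    def run(self, raw_input: str) -> State:
        state: State = {"raw_input": raw_input}
        for stage in self.stages:
            state = stage(state)
        return state

qa_agent = Pipeline([preprocess, retrieve, generate])
```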
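
Labs is described as compiling DSL scripts into executable pipelines that mix prompts with external tools. The toy example below parses an invented, Labs-like script into steps and returns a callable pipeline; the syntax, `TOOLS` registry, and `call_llm` stub are assumptions made for illustration and are not the actual Labs language.

```python
from typing import Callable

def call_llm(prompt: str) -> str:
    raise NotImplementedError("placeholder for any standard LLM provider")

TOOLS: dict[str, Callable[[str], str]] = {
    "word_count": lambda text: str(len(text.split())),  # stand-in external tool
}

# A toy script in an invented, Labs-like syntax: one directive per line.
SCRIPT = """
prompt Summarise the following report: {input}
tool   word_count
prompt Write a headline for a summary that is {last} words long
"""

def compile_script(script: str) -> Callable[[str], str]:
    """'Compile' the script into a pipeline function: parse once, run many times."""
    steps = [line.split(maxsplit=1) for line in script.strip().splitlines()]

    def pipeline(user_input: str) -> str:
        last = user_input
        for kind, arg in steps:
            if kind == "prompt":
                last = call_llm(arg.format(input=user_input, last=last))
            elif kind == "tool":
                last = TOOLS[arg](last)
        return last

    return pipeline

summarise = compile_script(SCRIPT)
# Once call_llm is implemented: summarise("...report text...")
```

Separating parsing from execution is what makes a script reusable and embeddable: the compiled pipeline is just a function that an application can call locally or in production.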