Open-AutoGLM
✨ An open-source intelligent assistant framework for mobile devices that understands screen content through multimodal methods and performs automated operations to help users complete tasks.
A visual AI development environment for building no-code data pipelines and multimodal agents, with real-time capabilities, social connectors, and AI-powered tools.
AIlice is a fully autonomous, general-purpose AI agent based on open-source LLMs. Using its unique Interactive Agents Call Tree (IACT) architecture, it decomposes complex tasks into dynamically constructed agents with high fault tolerance, enabling seamless task execution and result integration.
An AI-powered local automation tool driven by natural language: it understands screen content and performs operations the way a human would, letting users automate complex workflows without programming knowledge.
ComfyUI-Copilot is an AI-powered assistant for ComfyUI that provides comprehensive support for workflow creation, debugging, optimization, and parameter tuning throughout the entire development lifecycle, helping users efficiently build and optimize AI creative workflows.
An open-source multimodal AI Agent stack developed by ByteDance, comprising the general Agent TARS framework and the UI-TARS Desktop client. It enables natural language control of computers, browsers, and terminals via Vision-Language Models.