Feature Requests

Support for Function Calling in OpenAI's API
I am an enthusiastic user of Chatbase and value its capabilities immensely. However, I recently came across a new capability launched by OpenAI that I believe would greatly enhance the utility and efficiency of Chatbase.

OpenAI has introduced function calling for its gpt-4-0613 and gpt-3.5-turbo-0613 models. Developers can describe functions to these models, which can then intelligently output a JSON object containing the arguments needed to call those functions. This creates a more reliable connection between GPT's capabilities and external tools and APIs. A significant aspect of the feature is that the models detect when a function needs to be called (based on user input) and respond with JSON adhering to the function signature. This opens up exciting possibilities such as chatbots that answer questions by calling external tools, converting natural language into API calls or database queries, and extracting structured data from text. The ability to turn natural language into actionable API calls would substantially augment our interaction with the Chatbase platform.

Considering the power and versatility this feature brings, I kindly request that the Chatbase team consider supporting function definitions passed through your API. This would integrate seamlessly with OpenAI's API and add considerable value for your current and future customers. I am confident that this feature would significantly elevate the experience of using Chatbase and amplify the range of functionalities it currently offers.

Link to OpenAI blog post: https://openai.com/blog/function-calling-and-other-api-updates
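For reference, here is a minimal sketch of what a function-calling request looks like with the openai Python client from the time of that blog post (pre-1.0 versions); the get_order_status function is hypothetical and shown only for illustration:

```python
import json
import openai

# Describe a function the model may call. The model never executes it;
# it only returns the name and JSON arguments when it decides a call is needed.
functions = [
    {
        "name": "get_order_status",  # hypothetical external tool
        "description": "Look up the status of a customer order by its ID.",
        "parameters": {
            "type": "object",
            "properties": {
                "order_id": {
                    "type": "string",
                    "description": "The order identifier, e.g. '4521'",
                },
            },
            "required": ["order_id"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "Where is my order #4521?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether a function call is needed
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model chose to call the function; arguments arrive as a JSON string.
    name = message["function_call"]["name"]
    args = json.loads(message["function_call"]["arguments"])
    print(f"Model wants to call {name} with {args}")
else:
    print(message["content"])
```

After the model returns the function name and arguments, the calling application would execute the matching tool and send the result back to the model in a follow-up message so it can compose the final answer; this is the loop I would love to see Chatbase support.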

planned

Delayed Message Delivery - Human-like conversation flow to enhance user satisfaction and lead generation
Currently, the chat delivers messages in chunks, which can create a choppy and unnatural conversation flow when interacting with a bot. While we don't aim to hide the bot's nature, a smoother communication experience would enhance user satisfaction and lead generation.

Proposed Solution: Introduce a delay mechanism that lets the bot compose a message in its entirety before sending it to the user (see the sketch below). The delay could be:
- Fixed: a predetermined delay of 3-10 seconds, simulating a more natural typing pace.
- User-Configurable: allow users to adjust the delay duration to their preference, striking a balance between realism and response time.

Benefits:
- Improved User Experience: smoother message delivery creates a more engaging and human-like conversation flow, even when interacting with a bot.
- Enhanced Realism: the delay mimics natural typing patterns, making the bot's responses feel less robotic and more conversational.
- Customization: the optional nature of this feature lets users tailor the chat experience to their liking.

Additional Considerations:
- Visual Feedback: during the delay, show a visual indicator (e.g., a typing indicator or progress bar) so the user knows the bot is composing a message.
- Timeout: set a maximum delay duration to prevent excessive waiting for users who prefer quicker responses.
- Analytics: track user preferences for delay settings to optimize default values and the overall chat experience.
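To illustrate the proposed behavior, here is a minimal Python sketch of how the buffering, typing indicator, and timeout could fit together; all names (deliver_reply, FIXED_DELAY_SECONDS, and so on) are hypothetical and not part of any existing Chatbase API:

```python
import asyncio

# Illustrative settings only; not actual Chatbase configuration keys.
FIXED_DELAY_SECONDS = 3   # simulated typing pace for short replies
MAX_WAIT_SECONDS = 10     # cap so users never wait excessively long

async def deliver_reply(compose_full_reply, set_typing_indicator, send):
    """Buffer the complete bot reply, show a typing indicator, then send it in one piece."""
    set_typing_indicator(True)
    try:
        # Wait for the full reply instead of streaming chunks, but never past the cap.
        reply = await asyncio.wait_for(compose_full_reply(), timeout=MAX_WAIT_SECONDS)
        # Pad very fast replies so delivery resembles a natural typing rhythm.
        await asyncio.sleep(FIXED_DELAY_SECONDS)
    except asyncio.TimeoutError:
        reply = "Sorry, that took longer than expected. Please try again."
    finally:
        set_typing_indicator(False)
    send(reply)
```

The key point of the sketch is that the reply is sent only after it has been fully composed, with the typing indicator covering the wait and the timeout keeping the experience responsive.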