Much in the same way that companies adapt their software to run across different desktop, mobile and cloud operating systems, businesses now also need to configure their software for the fast-moving AI revolution, where large language models (LLMs) have emerged to power a new class of AI applications capable of interpreting and generating human-language text.

While a company can already create an "LLM-instance" of its software based on its current API documentation, the problem is that it needs to ensure the broader LLM ecosystem can use that instance properly, and it needs enough visibility into how well that instance of its product actually works in the wild. And that, effectively, is what Tidalflow is setting out to solve, with an end-to-end platform that enables developers to make their existing software play nice with the LLM ecosystem.

The fledgling startup is emerging out of stealth today with $1.7 million in a round of funding co-led by Google's Gradient Ventures alongside Dig Ventures, a VC firm set up by MuleSoft founder Ross Mason, with participation from Antler.

## Confidence

Consider this hypothetical scenario: an online travel platform decides it wants to embrace LLM-enabled chatbots such as ChatGPT and Google's Bard, allowing its customers to request airfares and book tickets through natural-language prompts in a search engine.

So the company creates an LLM-instance for each. But for all it knows, 2% of ChatGPT results serve up a destination that the customer didn't ask for, and the error rate might be even higher on Bard; it's just impossible to know for sure.

Now, if the company has a fail tolerance of less than 1%, it might simply feel safer not going down the generative AI route until it has greater clarity on how its LLM-instance is actually performing.

This is where Tidalflow enters the fray, with modules that help companies not only create their LLM-instance, but also test, deploy, monitor, secure and, eventually, monetize it. Companies can fine-tune the LLM-instance of their product for each ecosystem in a local, simulated, sandboxed environment, until they arrive at a solution that falls within their fail-tolerance threshold.

"The big problem is, if you launch on something like ChatGPT, you actually don't know how the users are interacting with it," Tidalflow CEO Sebastian Jorna told TechCrunch. "This lack of confidence in the reliability of their software is a major roadblock to rolling out software tooling into LLM ecosystems. Tidalflow's testing and simulation module builds that confidence."

Tidalflow can perhaps best be described as an application lifecycle management (ALM) platform into which companies plug their OpenAPI specification and documentation. Out the other end, Tidalflow spits out a "battle-tested LLM-instance" of that product, with the front end serving up monitoring and observability into how that LLM-instance will perform in the wild.

*Tidalflow's Coen Stevens (CTO), Sebastian Jorna (CEO) and Henry Wynaendts (CPO).*

Tidalflow is officially three months old, its founders Jorna (CEO) and Coen Stevens (CTO) having met through Antler's entrepreneur-in-residence program in Amsterdam. "Once the official program started in the summer, Tidalflow became the quickest company in Antler Netherlands' history to get funded," Jorna said.

Today, Tidalflow claims a team of three, including its two co-founders and chief product officer (CPO) Henry Wynaendts. But with a fresh $1.7 million in funding, Jorna said, the company is now actively looking to recruit for various front- and back-end engineering roles as it works toward a full commercial launch. If nothing else, the fast turnaround from founding to funding is indicative of the current generative AI gold rush.

With ChatGPT getting an API and support for third-party plugins, Google on its way to doing the same for the Bard ecosystem, and Microsoft embedding its Copilot AI assistant across Microsoft 365, businesses and developers have a big opportunity not just to leverage generative AI in their own products, but also to reach a vast number of users in the process.

"Much like the iPhone ushered in a new era for mobile-friendly software in 2007, we're now at a similar inflection point, namely for software to become LLM-compatible," Jorna noted.

Tidalflow will remain in closed beta for now, with plans to launch commercially to the public by the end of 2023.
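To make the OpenAPI angle concrete: the kind of document a company would plug into such a platform might look like the minimal sketch below, written for the hypothetical travel platform discussed above. Every endpoint and field name here is invented for illustration; this is not taken from Tidalflow or any real travel API.

```yaml
openapi: 3.0.3
info:
  title: Example Travel API   # hypothetical service, for illustration only
  version: "1.0"
paths:
  /flights:
    get:
      summary: Search airfares by origin and destination
      parameters:
        - name: origin
          in: query
          required: true
          schema: { type: string }
        - name: destination
          in: query
          required: true
          schema: { type: string }
      responses:
        "200":
          description: A list of matching fares
```

An LLM ecosystem such as ChatGPT plugins reads exactly this sort of machine-readable description to decide when and how to call the underlying API, which is why the spec is the natural input for tooling like Tidalflow's.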
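The sandboxed testing loop described above can be sketched in a few lines of Python. Everything in this sketch is hypothetical illustration, not Tidalflow's actual tooling: the function names, the simulated 2% error rate and the 1% fail-tolerance threshold are all taken from, or invented around, the article's hypothetical travel-platform scenario.

```python
import random

# Hypothetical sketch of a sandboxed evaluation harness, assuming a company
# wants to measure an LLM-instance's error rate before deploying it.

FAIL_TOLERANCE = 0.01  # the company accepts at most 1% wrong answers


def simulated_llm_booking(request: str) -> str:
    """Stand-in for an LLM-instance handling a travel request.

    Returns the wrong destination ~2% of the time, mimicking the
    hypothetical ChatGPT error rate from the scenario above.
    """
    destination = request.split()[-1]
    if random.random() < 0.02:
        return "WRONG_DESTINATION"
    return destination


def measure_error_rate(requests, handler) -> float:
    """Replay test prompts in the sandbox and report the observed error rate."""
    errors = sum(1 for r in requests if handler(r) != r.split()[-1])
    return errors / len(requests)


if __name__ == "__main__":
    random.seed(42)
    prompts = [f"book a flight to City{i}" for i in range(10_000)]
    rate = measure_error_rate(prompts, simulated_llm_booking)
    print(f"observed error rate: {rate:.2%}")
    if rate <= FAIL_TOLERANCE:
        print("within fail tolerance: ready to deploy")
    else:
        print("above fail tolerance: keep fine-tuning in the sandbox")
```

The point of the loop is the comparison at the end: only once the measured rate falls under the company's own threshold would the LLM-instance be promoted out of the sandbox.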