For one company, large language models brought a breakthrough in artificial intelligence (AI): building AI features became a matter of crafting prompts and calling APIs, with no need for AI science expertise. To enhance the developer experience when building AI applications and tools, they defined principles around simplicity, immediate accessibility, security and quality, and cost efficiency.
Romain Kuzniak spoke about enhancing developer experience for creating AI applications at FlowCon France 2024.
Scaling their first AI application to meet the needs of millions of users revealed a substantial gap, Kuzniak said. The transition would have required them to hire data scientists, build a dedicated technical stack, and navigate numerous areas where they lacked prior experience:
Given the high costs and extended time to market, coupled with our status as a startup, we had to carefully evaluate our priorities. There were numerous other opportunities on the table with potentially higher returns on investment. As a result, we decided to pause this initiative.
The breakthrough in AI came with the emergence of large language models (LLMs) like ChatGPT, which shifted the approach to utilizing AI, Kuzniak said. The key change that LLMs brought was a significant reduction in the cost and complexity of implementation:
With LLMs, the need for data scientists, data cleansing, model training, and a specific technical infrastructure diminishes. Now, we could achieve meaningful engagement by simply crafting a prompt and utilizing an API. No need for AI science expertise.
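To illustrate what "crafting a prompt and utilizing an API" can look like in practice, here is a minimal sketch of a single prompt sent to a hosted LLM. The OpenAI Python SDK, the model name, and the helper function are assumptions chosen for illustration; the talk does not specify which provider or stack the company uses.

```python
# Minimal sketch: one crafted prompt, one API call, no model training.
# Provider, model name, and function are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_course_feedback(feedback: str) -> str:
    """Send a single prompt and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any chat model works
        messages=[
            {"role": "system", "content": "Summarize student feedback in two sentences."},
            {"role": "user", "content": feedback},
        ],
    )
    return response.choices[0].message.content


print(summarize_course_feedback("The course was clear, but the exercises felt too short."))
```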
Kuzniak mentioned that enhancing the developer experience is as crucial as improving user experience. Their goal is to eliminate any obstacles in the implementation process, ensuring a seamless and efficient development flow. They envisioned the ideal developer experience, focusing on simplicity and effectiveness:
For the AI implementation, we’ve established key principles:
- Simplicity: enable implementation with just one line of code.
- Immediate Accessibility: allow real-time access to prompts without the need for deployment.
- Security and Quality: integrate security and quality management by design.
- Cost Efficiency: design cost management and thresholds into the system by default.
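A hypothetical sketch of how these principles might surface to developers follows: a single one-line entry point (simplicity), prompts fetched from a shared store at call time so edits need no redeployment (immediate accessibility), a central place for input checks and logging (security and quality by design), and a default per-call cost threshold (cost efficiency). All names and numbers below are invented for illustration and are not Kuzniak's actual implementation.

```python
# Hypothetical internal helper embodying the four principles.
# None of these names come from the talk; they only illustrate the idea.
PROMPT_STORE = {  # stand-in for a shared prompt store queried at runtime
    "welcome-email": "Write a short welcome email for a student named {name}.",
}

MAX_COST_PER_CALL_USD = 0.05  # cost threshold enforced by default


def estimate_cost(prompt: str) -> float:
    """Very rough cost estimate from prompt length (assumption, not a real pricing model)."""
    return len(prompt) / 4 / 1000 * 0.01


def call_llm(prompt: str) -> str:
    # Placeholder for the actual provider call (see the earlier API sketch).
    return f"[LLM reply to: {prompt[:40]}...]"


def run(prompt_name: str, **variables) -> str:
    """The 'one line of code' entry point developers call."""
    template = PROMPT_STORE[prompt_name]                 # immediate accessibility
    prompt = template.format(**variables)
    if estimate_cost(prompt) > MAX_COST_PER_CALL_USD:    # cost efficiency
        raise RuntimeError("Estimated cost exceeds the default threshold")
    # Security and quality by design: central hook for input filtering and logging.
    if "api_key" in prompt.lower():
        raise ValueError("prompt must not contain secrets")
    return call_llm(prompt)


# For the developer, usage stays one line:
print(run("welcome-email", name="Ada"))
```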
Kuzniak mentioned that their organizational structures are evolving along with the technology landscape. Traditional cross-functional teams comprising product managers, designers, and developers, while still relevant, may not always be the optimal setup for AI projects, as he explained:
We should consider alternative organizational models. The way information is structured and its subsequent impact on the quality of outcomes, for example, has highlighted the need for potentially new team compositions. For instance, envisioning teams that include AI product managers, content designers, and prompt engineers could become more commonplace.
Kuzniak advised applying the same level of dedication and best practices to improve the internal user experience as you would for your external customers. Shift towards a mindset where your team members consider their own ideal user experience and actively contribute to creating it, he said. This approach not only elevates efficiency and productivity, but also significantly enhances employee satisfaction and retention, he concluded.
InfoQ interviewed Romain Kuzniak about developing AI applications.
InfoQ: What do your AI applications look like?
Romain Kuzniak: Our AI applications are diverse, with a stronger focus on internal use, particularly given our nature as an online school generating substantial content. We prioritize making AI tools easily accessible to the whole company, notably integrating them within familiar platforms like Slack. This approach ensures that our staff can leverage AI seamlessly in their daily tasks.
Additionally, we’ve developed a prompts catalogue. This initiative encourages our employees to leverage existing work, fostering an environment of collective intelligence and continuous improvement.
Externally, we’ve extended the benefits of AI to our users through the introduction of a student AI companion, for example. This tool is designed to enhance the learning experience by providing personalized support and guidance, helping students navigate their courses more effectively.
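As one illustration of surfacing such tools inside Slack, the sketch below wires a slash command to an LLM call using the slack_bolt library. The command name, environment variables, and the call_llm helper are assumptions for illustration; the interview does not describe the company's actual integration.

```python
# Hypothetical Slack integration: a /ask slash command that forwards the
# user's text to an LLM and replies in the channel. Library choice
# (slack_bolt) and all names are illustrative assumptions.
import os

from slack_bolt import App

app = App(
    token=os.environ["SLACK_BOT_TOKEN"],
    signing_secret=os.environ["SLACK_SIGNING_SECRET"],
)


def call_llm(prompt: str) -> str:
    # Placeholder for the provider call shown in the earlier sketch.
    return f"[LLM reply to: {prompt[:60]}]"


@app.command("/ask")
def handle_ask(ack, respond, command):
    ack()  # acknowledge within Slack's 3-second window
    question = command.get("text", "").strip()
    respond(call_llm(question) if question else "Please add a question after /ask.")


if __name__ == "__main__":
    app.start(port=int(os.environ.get("PORT", 3000)))
```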
InfoQ: What challenges do you currently face with AI applications and how do you deal with them?
Kuzniak: Among the various challenges we face with AI applications, the most critical is resisting the temptation to implement AI for its own sake, especially when it adds little value to the product. Integrating AI features because they’re trendy or technically feasible can divert focus from what truly matters: the value these features bring to our customers. We’ve all encountered products announcing their new AI capabilities, but how many of these features genuinely enhance user experience or provide substantial value?
Our approach to this challenge is rooted in fundamental product management principles. We continuously ask ourselves what value we aim to deliver to our customers and whether AI is the best means to achieve this goal. If AI can enhance our offerings in meaningful ways, we’ll embrace it. However, if a different approach better serves our users’ needs, we’re equally open to that.