AWS Adds Generative AI Tool to Automate Tasks for Developers
Amazon Web Services (AWS) this week made generally available a generative artificial intelligence (AI) assistant capable of executing complex workflows on behalf of application developers.
Doug Seven, general manager and director of AI developer experiences at AWS, said Amazon Q Developer employs the reasoning engines within large language models (LLMs) to automate a range of coding, testing, documentation, security, troubleshooting, optimization and upgrade tasks.
It takes advantage of a smaller, faster LLM designed specifically for software development and code writing to provide access to agents that are capable of asynchronously executing, for example, a suite of tests or migrating an application to the latest version of Java, said Seven.
Unlike a general-purpose LLM, Amazon Q Developer was trained on data sets that AWS fine-tuned for code generation, including examples of code that Amazon uses to deliver its own services, to produce more consistent outputs, he noted.
The overall goal is to reduce the level of toil currently associated with building software, enabling developers to solve more problems faster, said Seven.
Amazon Q Developer is part of a family of generative AI platforms dubbed Amazon Q that also includes a version for business users and Amazon Q Apps, a technology preview based on the PartyRock application development platform that enables subject matter experts within businesses to build applications on their own.
AWS unveiled Amazon Q last year and is now extending the reach of that platform into the realm of application development. Amazon Q essentially provides a layer of abstraction that makes the LLMs hosted on the Amazon Bedrock service more accessible, said Seven.
It’s not clear to what degree the level of automation now being enabled by generative AI platforms will impact the way DevOps teams are structured, but as more of the tasks that DevOps teams manage on behalf of developers are automated, the overall pace at which applications are constructed is about to dramatically increase.
At the same time, providers of DevOps platforms are investing heavily in AI to enable DevOps teams to deploy and manage applications at what will soon be an unprecedented level of scale.
As AI continues to advance, DevOps teams should be identifying a list of tasks that will soon be eliminated, many of which today create bottlenecks that slow down the pace of innovation.
Regardless of which application development platform is employed, software engineering teams will soon be augmented by any number of AI assistants. The next challenge will be determining how many AI assistants will be required as each provider of a platform makes one or more available. DevOps teams may decide to rely on a small number of AI assistants capable of managing tasks across multiple platforms or they may opt to orchestrate a larger number of AI assistants provided by multiple vendors.
One way or another, DevOps workflows will never be the same again as every member of a DevOps team gains an AI assistant capable of performing a wide range of tasks that team members now simply have to supervise rather than perform themselves.