Microsoft, in conjunction with OpenAI, may soon have a solution that lets coders use natural language to generate code for projects such as creating unique dialogue for NPCs in video games or assistive features for engineers and architects.
Microsoft-owned GitHub highlighted that its OpenAI-driven Copilot project got some love from Bloomberg in a piece titled "Microsoft Wants AI To Change Your Job—If It Can Work Out the Kinks."
The piece, written by Dina Bass, puts GitHub's vision of AI-driven code assistance front and center for developers, engineers, video game studios, and organizations looking to cut out hours of tedious coding and reclaim that saved time for more creative work.
According to Microsoft, Copilot works best when developers are looking to fill in simple code, the kind that can be found in GitHub's open-source archives. The promise, though, is that eventually a developer will be able to select a programming language, type a few lines of natural language describing, say, an address-storing system, and be presented with multiple lines of grey, italicized code suggestions to insert.
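To make that workflow concrete, here is a hypothetical illustration (not actual Copilot output) of the kind of suggestion described: a developer writes a plain-English comment about an address-storing system, and the assistant proposes an implementation. The class and method names below are invented for this sketch.

```python
# A developer types a plain-English comment like the one below, and a
# Copilot-style assistant proposes the implementation as greyed-out text.
# This is an invented illustration, not actual Copilot output.

# Store and look up mailing addresses by a person's name.
class AddressBook:
    def __init__(self):
        self._addresses = {}  # maps a name to an address string

    def add(self, name, address):
        self._addresses[name] = address

    def lookup(self, name):
        # Returns None if the name has not been added.
        return self._addresses.get(name)

book = AddressBook()
book.add("Ada Lovelace", "12 St James's Square, London")
print(book.lookup("Ada Lovelace"))
```

The developer would then accept, edit, or discard the suggested lines, just as they would any autocomplete result.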
Sometime down the road, if things pan out as hoped, "conversations in games that often feel stilted or repetitive—from, say, villagers, soldiers and other background characters—could suddenly become engaging and responsive."
Other use cases described for Copilot by Microsoft CEO Satya Nadella at Ignite include adding assistive features for architects and industrial designers, as well as bringing Clippy-esque virtual assistants back to Word, Excel, or Teams to automate tasks such as recording meeting minutes and summarizing conversations.
Microsoft is also looking at Copilot as a defensive tool that could respond to cybersecurity threats faster than humans can. Vasu Jakkal, Microsoft's security vice president, says the company's cybersecurity products team is in the early stages of figuring out how Copilot's AI-driven results could help keep hackers at bay or mitigate ongoing threats.
On the cybersecurity front, Microsoft acknowledges that Copilot could initially be made vulnerable through the very core of its valued technology, Codex. Codex is the mechanism that enables natural language queries for coders. Microsoft engineers have demonstrated how Codex translates plain-language commands into code with a Minecraft demo that took plain English instructions and turned them into code enabling a game character to look, walk, and craft.
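The internals of that demo were not published, but the idea can be sketched in miniature: plain-English instructions get mapped to game-character actions. In the real demo, Codex generated this kind of glue code from natural language; the keyword mapping and action names below are invented assumptions for illustration.

```python
# Minimal sketch of the idea behind the Minecraft demo: translating
# plain-English commands into game-character actions. The real demo used
# Codex to generate code from natural language; this mapping is invented.

def interpret(command):
    """Map a plain-English command to a game action name."""
    actions = {
        "look": "look_around",
        "walk": "walk_forward",
        "craft": "craft_item",
    }
    lowered = command.lower()
    for keyword, action in actions.items():
        if keyword in lowered:
            return action
    return "idle"  # fall back when no keyword matches

print(interpret("Please walk over to the tree"))  # walk_forward
print(interpret("Craft a wooden pickaxe"))        # craft_item
```

A real system replaces the hand-written keyword table with a language model, which is exactly what makes it powerful and, as Microsoft notes, potentially exploitable: the translation layer itself becomes an attack surface.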
Copilot is not a panacea, and it will present Microsoft, GitHub, and OpenAI with ethical and legal hurdles. As AI ethics researcher Margaret Mitchell sums it up: "One of the big problems with large language models is they're generally trained on data that is not well documented." As Microsoft learned painfully and publicly with its AI chatbot Tay, programmers can embed offensive speech in long bodies of code, while hackers could figure out how to subvert Copilot's original intentions and substitute their own security vulnerabilities.
Another potential problem waiting in the wings: what does Copilot mean for the human coding labor force?
Fortunately, Microsoft and GitHub don't yet have to answer questions such as these, as Copilot is still spitting out hilariously unoptimized answers some of the time: one search for "the most corrupt company" returned Microsoft. Copilot also has to resolve ethical concerns around copyright and the potential spread of security flaws. Still, despite the "corrupt company" blunder and the unanswered ethical questions, Cassidy Williams, CTO of AI startup Contenda, believes in the project and eagerly awaits its exit from beta.