Navigating AI Now Could Pay Big Dividends Later for Your TTO
Using ChatGPT and other public AI systems for tasks involving intellectual property can be risky and should be done with caution. But gaining experience with those open systems now could help your tech transfer office (TTO) capitalize on the benefits of a private AI system when those become more widely available. And that could happen sooner than you think.
The key to navigating these waters is understanding the difference between open AI systems like ChatGPT, which “learn” from a broad spectrum of publicly available information and generate responses that anyone can see, and closed AI systems in which a tech transfer office can tightly control what information the system is given and how it is shared.
For tech transfer offices, a closed AI system is the holy grail, as AUTM Eastern Region Meeting attendees heard last month at a presentation detailing NYU Langone Health’s experience with a private system. After just a few months of training the system, NYU’s tech transfer staff can now review up to 30 agreements per hour and have identified more than $1 million in outstanding obligations, according to Marc Sedam, Vice President of Technology Opportunities and Ventures at NYU.
“I suspect within 16 to 18 months most tech transfer offices will have some kind of private system,” Sedam said. “The return we’re generating so far outstrips what the cost of an investment would be.”
With closed AI systems still rare among tech transfer offices, many are weighing the benefits and pitfalls of open AI systems.
“AI is an extremely useful tool that will undoubtedly impact invention, innovation and commercialization,” said Cameron Smith, Assistant Vice President – Research Commercialization at Texas Tech University, whose office hosted a Lunch & Learn panel discussion on generative AI last month that drew 115 attendees. “We have used it internally for operations and proposal development. But there are still quite a few unknowns. Faculty inventors and even tech transfer professionals should proceed with caution when inputting proprietary ideas or novel concepts into generative AI platforms, which could affect the patentability of your technology.”
An AUTM eGroup exchange in August indicated that many Members were considering use of generative AI in their tech transfer work but shared Smith’s concerns about sharing confidential information with an open AI system, as well as the possibility that an open AI system’s responses would be biased or even nonsensical “hallucinations.”
In the eGroup thread, some Members said their universities had issued — usually via the IT department — AI-specific policies, or statements pointing to existing policies relevant to AI use. Others suggested that TTOs should consider establishing their own policy or standard operating procedure for working with an open AI system.
“There are additional precautions that staff should know before engaging with AI for business purposes, which is different from standard university IT policies,” said Mandy Checkai, Applications Development Manager at Wisconsin Alumni Research Foundation (WARF). “In particular, these precautions include the need for results to be fully reviewed by a human to ensure the AI is not being biased or inaccurate.”
Despite the limitations of open AI systems, gaining experience with them now will put TTOs in a position to make the most of a closed AI system when those become more widely available, Sedam said.
“It’s coming faster than anyone thinks, so finding ways to play with the public versions now so you’re ready later is smart,” he said. “You can create templates and prompts using non-proprietary information and get it to work, then wait to implement those processes when you have a private system.”
Learn more about generative AI and its tech transfer potential at the 2024 AUTM Annual Meeting, which will include three sessions on this topic.