Tailoring Generative AI: A Secure Sandbox and the Need for Role-Specific Training and Resources

  • By Dan Hawkins

JOINT BASE SAN ANTONIO-RANDOLPH, Texas – Attendees at the Air Education and Training Command Force Development Summit here March 25 learned how the Air Force is driving adoption of generative artificial intelligence (AI) through tailored, role-based training designed to empower Airmen and Guardians to work more efficiently and effectively.

The effort, spearheaded by the Air Force Research Laboratory (AFRL), focuses on equipping personnel with the knowledge and skills to use generative AI tools responsibly and ethically across a range of mission-critical functions.

"Generative AI holds immense potential to revolutionize how the DoD operates," said Courtney Klement, Digital Culture Transformation Lead at the Air Force Research Laboratory at Wright-Patterson Air Force Base, Ohio. “But its success hinges on responsible and strategic implementation tailored to the unique demands of each role."

A Secure Sandbox for Exploration and Data Collection

As a primary example of how AFRL is tackling the challenges of generative AI, Klement discussed a secure platform developed to allow DoD personnel to explore and experiment with Large Language Models (LLMs) in a controlled environment.

"This secure platform serves as a vital testing ground, allowing us to gather crucial data on user interactions and the impact of AI on decision-making, which in turn informs responsible AI adoption strategies across the DoD,” Klement noted. “This data allows the AFRL to understand how generative AI can be most effectively integrated into various workflows and to develop targeted training and resources that maximize its impact on mission success.”

By providing early and secure access to LLMs, this platform is paving the way for wider and more effective adoption of generative AI across the Air Force, Klement said.

Addressing the Need for Tailored AI Training

Recognizing that a one-size-fits-all approach won't unlock the full potential of generative AI, Klement highlighted the development of role-specific training resources.

"It's not enough to simply provide access to these powerful tools," she explained. "We need to equip our workforce with the knowledge to understand ethical considerations, craft effective prompts, and identify practical applications relevant to their daily tasks and responsibilities."

Role-Based Guides: Putting AI into Action

Central to the DoD's approach is a series of role-based guides being developed by AFRL to provide practical guidance and build user confidence.

"These guides offer clear examples, sample prompts, and use cases tailored to specific functional areas, empowering users to quickly understand how generative AI can be applied to their daily work,” Klement said.

Guides for supervisors and managers, administrative support staff, and human resources personnel are currently available in the sandbox. Guides for acquisitions, legal, and public affairs are slated for release soon, with the remaining guides expected by the end of April.

The AFRL is also launching hands-on workshops to complement the guides and provide practical training opportunities.

“By prioritizing role-specific training and resources, the DoD is fostering a culture of responsible and strategic AI adoption, ensuring its workforce remains at the forefront of innovation and maintains a competitive advantage on the global stage,” Klement said.

(Editor’s Note: Creation of this article was assisted by NIPRGPT)

 