AI research support

Accelerate, optimize and discover with AI

As ASU researchers set out to solve humanity’s greatest challenges, our goal is to provide the best tools to empower them to succeed. AI is an essential part of the modern toolkit, offering new opportunities to advance research that matters to our communities today.

Knowledge Enterprise and Research Computing at ASU provide world-class AI models, tools and high-performance computing resources to faculty, students, staff and community members.

By opening access to tools and platforms across all disciplines, we foster an inclusive environment for AI research that joins scientific, humanistic and artistic exploration.

Our infrastructure, which includes the Sol supercomputer, offers the power and flexibility to push boundaries in AI research — whether you are developing large language models (LLMs), accelerating AI-driven software, or applying deep learning techniques to complex problems.

Boosting research with AI tools

Studying and sorting through large datasets is an important part of research. This is exactly where AI models offer a major advantage — they are ideally suited to analyze huge amounts of information with speed.

Research Computing provides local access to powerful open-source AI models on the Sol supercomputer, listed in the table below. Researchers can use these models separately or in combination to examine datasets stored locally, which improves data security by eliminating the need for cloud uploads.
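To give a flavor of this local workflow, here is a minimal sketch of loading one of the models from the table (Llama 3 8B Instruct in this example) with the Hugging Face transformers library. The weights path and input text are placeholders, and the transformers and accelerate packages are assumed to be available in your Python environment; Research Computing can confirm the actual model locations and environments on Sol.

```python
# Minimal sketch: run a locally stored open-source model on a Sol GPU node,
# so research data never leaves the cluster.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "/path/to/shared/models/Meta-Llama-3-8B-Instruct"  # placeholder path

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype=torch.bfloat16,  # half precision to fit comfortably on a single GPU
    device_map="auto",           # place the model on the available GPU(s)
)

# Stand-in for a locally stored dataset or document.
document = "Participant notes from the field study go here."

prompt = f"Summarize the key findings in the following notes:\n\n{document}"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=300)

# Print only the newly generated text, not the echoed prompt.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```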

Sol also carries a suite of AI-powered tools, including MONAI, AlphaFold3, TensorFlow and PyTorch. These provide faculty, students and staff with a robust computing environment for AI-enabled research.
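As a small, self-contained taste of that environment, the sketch below runs a single PyTorch training step on a GPU, falling back to CPU if none is allocated. The toy network and synthetic batch are purely illustrative.

```python
# Illustrative only: one PyTorch training step on a toy network with synthetic data.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Small feed-forward classifier standing in for a research model.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Synthetic batch of 32 samples with 64 features and 10 classes.
x = torch.randn(32, 64, device=device)
y = torch.randint(0, 10, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

print(f"Device: {device} | loss after one step: {loss.item():.4f}")
```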

ASU’s advanced computing resources not only drive innovation and discovery in AI research but also contribute to numerous scientific publications across disciplines. Researchers leveraging ASU’s Sol supercomputer, local AI models and GPU acceleration have produced impactful studies in areas such as machine learning, large language models, biomedical AI and computational science.

Model | Release date | Specialties and features | Parameters | Context window (tokens)
ClimateGPT | January 2024 | Climate science-focused, adapted from Llama 2 with domain-specific training | 7B, 13B, 70B | 4k
Llama 3 | April 2024 | Improved reasoning, instruction-following, multilingual support | 8B, 70B | 8k
NVLM | September 2024 | Vision-language model; excels in OCR, multimodal reasoning and coding | 72B | NA
Llama 3.3 | December 2024 | Optimized for multilingual dialogue, comparable to Llama 3.1 (405B) | 70B | 128k
DeepSeek R1 | January 2025 | Mixture of experts, strong reasoning, general-purpose and specialized tasks | 1.5B to 70B (distilled), 671B (MoE, 37B active) | 128k
Falcon 3 | February 2025 | Focus on science, math and coding, with various instruction-tuned and base models | 1B, 3B, 7B, 10B | 32k
Gemma 3 | March 2025 | Multimodal (text and image) instruction-tuned versions | 1B, 4B, 12B, 27B | 32k (1B), 128k (others)

Choosing the right AI model: what to know

Before comparing models, here are two key terms that will help you interpret the table above:

 

  • Parameters: A model's parameters are its "brainpower." The more parameters a model has, the more information it can process and the more complex the patterns it can learn.
  • Context window: How much text a model can consider at once. A larger context window allows the model to understand and respond to longer inputs, which is useful for analyzing documents, conducting in-depth conversations or performing multi-step reasoning. A rough way to check whether a document fits is sketched below.
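
To make the context window concrete, here is a rough, self-contained sketch that uses a common rule of thumb of about four characters per token for English text, treating the table's 4k, 8k and 128k windows as roughly 4,096, 8,192 and 131,072 tokens. For a precise count, use the model's own tokenizer.

```python
# Rough sketch: estimate whether a document fits in a model's context window.
# The ~4 characters-per-token figure is a rule of thumb, not an exact count.
CONTEXT_WINDOWS = {       # approximate values from the table above, in tokens
    "ClimateGPT": 4_096,
    "Llama 3": 8_192,
    "Falcon 3": 32_768,
    "Llama 3.3": 131_072,
}

def estimated_tokens(text: str) -> int:
    return len(text) // 4  # roughly 4 characters per token for English prose

document = "Example paragraph from an interview transcript. " * 2_000  # stand-in text
needed = estimated_tokens(document)

for name, window in CONTEXT_WINDOWS.items():
    verdict = "fits" if needed <= window else "too long"
    print(f"{name:>10}: window {window:>7,} tokens | document ~{needed:,} tokens -> {verdict}")
```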

Model selection tips

Different research needs call for different models. Here are some general guidelines:

 

  • For fast, resource-efficient tasks, consider smaller models such as Falcon 3 or Llama 3 variants with fewer parameters.
  • For working with long documents or multi-step prompts, choose a model with a large context window (e.g., Llama 3.3 or DeepSeek R1).
  • For research requiring high performance in reasoning, coding or technical language, DeepSeek R1 is a strong choice.
  • For vision-language work such as OCR and multimodal reasoning, consider NVIDIA's NVLM.
  • For multilingual work, Llama 3.3 supports multiple languages and is well suited to cross-language tasks.

If you’re unsure where to start, Llama models offer a solid balance of performance and flexibility across many research domains.
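
Purely as an illustration (not an official recommender), the guidelines above can be condensed into a few lines of Python keyed on the table earlier on this page:

```python
# Hypothetical helper that mirrors the selection tips above; adjust to your needs.
def suggest_model(long_context=False, multimodal=False,
                  multilingual=False, limited_gpu_memory=False) -> str:
    if multimodal:
        return "NVLM or Gemma 3"           # vision-language support
    if long_context:
        return "Llama 3.3 or DeepSeek R1"  # 128k-token context windows
    if multilingual:
        return "Llama 3.3"                 # optimized for multilingual dialogue
    if limited_gpu_memory:
        return "Falcon 3 (1B-10B) or Llama 3 (8B)"  # smaller parameter counts
    return "Llama 3"                       # solid general-purpose default

print(suggest_model(long_context=True))    # -> Llama 3.3 or DeepSeek R1
```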

 

Getting started


All AI models listed on this page are freely available to ASU researchers on the Sol supercomputer. To use them, you must first request access to Sol. Faculty and research staff may request accounts directly; students and non-faculty users must be sponsored by an ASU faculty member. Once your account is approved, you will receive onboarding instructions and access credentials.


Next, to begin using the models, schedule a consultation with the Research Computing team. We’ll walk you through accessing the Sol supercomputer via the web portal, launching a Jupyter Notebook and selecting the appropriate tools for your research. This personalized guidance will help ensure you're set up for success, especially if you're new to working with AI models in a supercomputing environment.
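
Once your Jupyter session is running on Sol, a quick first cell such as the sketch below (assuming one of the pre-built Python environments with PyTorch installed) confirms which environment you are in and whether a GPU is visible before you start heavier work.

```python
# Sanity-check cell for a new Jupyter session on Sol.
import sys
import torch

print("Python:", sys.version.split()[0])
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))  # only shown if your session requested a GPU
```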
 

Bringing AI to the classroom

ASU equips students to engage in hands-on AI learning, giving them valuable experience with technology that continues to shape the modern workforce. Students can access AI applications on the Sol supercomputer if their instructors request student accounts from Research Computing, which provides AI computing support for academic courses.

Students with accounts can immediately begin working with cutting-edge AI tools without instructors needing to handle software setup. Research Computing pre-configures software environments, including several Python environments and ready-to-use Jupyter notebooks integrated with AI models. 

Below is a sample of AI-related courses supported by Research Computing in the last academic year.



Course Number | Course Name | Term
EEE 549 | Statistical Machine Learning: From Theory to Practice | Fall 2024
CSE 575 | Statistical Machine Learning | Fall 2024
CSE 598 | Frontier Topics in GenAI | Fall 2024
EEE 598 | Algorithm/Hardware Co-Design/Design Automation for Emerging AI Hardware | Fall 2024
EEE 598 | Deep Learning: Foundations and Applications | Fall 2024
CSE 576 | Topics in Natural Language Processing | Fall 2024
CSE 575 | Statistical Machine Learning | Fall 2024
EEE 598 | Generative AI: Theory and Practice | Spring 2025
CIS 508 | Machine Learning in Business | Spring 2025
CEN 524, CSE 524, CSE 494 | Machine Learning Acceleration | Spring 2025
CSE 575 | Statistical Machine Learning | Spring 2025
CSE 476 | Introduction to Natural Language Processing | Spring 2025
MFG 598 | AI in Additive Manufacturing | Spring 2025
CSE 576 | Topics in Natural Language Processing | Spring 2025
FIN 597 | AI and Machine Learning Applications in Finance | Spring 2025

 

Events


Research Computing provides cutting-edge resources, training and events to empower researchers in the rapidly evolving field of AI.

AI training and workshops


In addition to its flagship events, Research Computing hosts dozens of interactive training sessions each year, many covering AI topics such as:

  • Harnessing the potential of large language models for innovation
  • Accelerating research with GPUs
  • Advanced research acceleration with GPUs
  • Python: Machine learning
  • Python: Deep learning

For a full list of upcoming workshops, visit the Research Computing documentation site. Explore past workshop materials and recordings on the Research Computing Expo page.

Resources for researchers

AI-powered innovation

A chatbot for cancer questions

Professor Chitta Baral’s team is building a chatbot to answer people’s questions about prostate cancer. To create the basic AI model, they are first training a large language model using clinical data from the MIMIC project. Then, the team will make two versions of the model. One will get extra training with clinical guidelines and Q&A data about prostate cancer, while the other will only get the Q&A data. The team will test both versions to see which works better as a chatbot.

AI-powered innovation

Building AI skills

A group of 30 students received firsthand experience with machine learning in Assistant Professor Suren Jayasuriya’s course, Deep Learning: Foundations and Applications. Deep learning is a type of AI that learns from data using networks inspired by the human brain. Throughout the semester, students used powerful computers to train and test advanced AI models. Access to these tools helped them build skills in neural networks, model optimization and AI applications.

A safe AI environment

Many scientists want to leverage AI but also need to protect proprietary data. The university’s OpenAI API, offered through Research Computing, empowers ASU researchers to integrate OpenAI’s advanced models into their projects while ensuring their data remains private and owned by ASU.

This service supports applications in natural language processing, data analysis and AI-driven innovation. ASU provides an initial trial period at no cost. After the trial period, researchers are responsible for paying for continued access.
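
As a rough sketch of what integration can look like, the example below uses the official openai Python package (version 1 or later). The endpoint URL, model name and key shown here are placeholders; Research Computing supplies the actual values when you enroll in the service.

```python
# Illustrative sketch: calling the university-hosted OpenAI service.
# The base_url, api_key and model name below are placeholders, not real values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASU_ISSUED_KEY",                 # placeholder key
    base_url="https://example.asu.edu/openai/v1",  # placeholder endpoint
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use a model name enabled for your project
    messages=[
        {"role": "system", "content": "You are a research assistant."},
        {"role": "user", "content": "Summarize the main variables in this survey codebook."},
    ],
)

print(response.choices[0].message.content)
```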


AI-powered innovation

Including older adults in generative AI

Many older adults and other groups that spend less time online do not contribute as much online content, making it harder for AI to learn from them. In a project funded by the National AI Research Resource Pilot, Associate Professor Yezhou "YZ" Yang aims to fix that. Yang’s team is developing new AI learning methods that help text- and image-generating models better understand and represent these groups. Working with the ASU Foundation, the team will test these methods with older adults in ASU’s lifelong learning programs. This will help generative AI models be more inclusive of underrepresented groups and less biased toward younger users.


ASU experts

Search for ASU experts and scholarly works by topic at experts.asu.edu. Find university experts based on their academic journal publications, academic prizes, external engagements and other professional activities.


Did you know?


In 2020, ASU was named a Dell Technologies High Performance Computing and Artificial Intelligence Center of Excellence. This recognition gave the university access to a worldwide program that facilitates the exchange of ideas among researchers, computer scientists, technologists and engineers for the advancement of high-performance computing and artificial intelligence solutions. ASU later leveraged its relationship with Dell Technologies to co-design and build the Sol supercomputer announced in 2022.