Tuesday, November 18, 2025

ODURF GRA support request

Here’s a streamlined protocol to follow the next time you submit a GRA support request in the Research Foundation portal.


Protocol: Creating a GRA EPASS / Assignment in the Research Foundation Portal


0. Before you start


Have these items ready:

  • GRA’s full name and ODU email

  • Their home academic department/program (critical for routing)

  • Employee type: GR (Graduate Research Assistant)

  • Pay basis: semester or annual (you used semester basis)

  • Stipend for the semester (e.g., $11,000)

  • Hours per week: usually 20 hours

  • Funding project (RF project number)

  • Whether there is a tuition exemption, and if so:

    • Source (e.g., ODU Research Foundation)

    • Level (Master’s or Doctoral)


Important: You must know the student’s home department/program. The EPASS routes to that chair/dean for approval and cannot be changed later. If it’s wrong, the assignment must be deleted and recreated.


1. Start a new assignment

  1. Log in to the Research Foundation portal: https://hera.odurf.odu.edu/RFPortal

  2. Go to “Research Assignments”.

  3. In the blue bar, click “Add Assignment”.


2. Add or select the GRA as an employee

  1. Next to Employee ID, click “Select”.

  2. Try typing the student’s name:

    • If found: select them.

    • If not found:

      • Click “Start a new employee” at the bottom.

      • Enter first name, last name, and email.

      • Save.

      • Then click “Select” again and choose the new employee.


3. Set employee type, department, and term

  1. Set Employee Type to GR.

  2. Choose Pay Basis:

    • For GRA by term, select Semester basis.

  3. Select Employee Department from the dropdown (e.g., 6093 Computer Science).

    • This must be the student’s home department/program (not your department if they’re different).

    • Do not proceed until you are sure this is correct; it controls the routing path.

  4. Select the semester (e.g., Fall).

  5. Click “Save and Next”.


If the wrong department is chosen at this step, it cannot be edited later. The EPASS must be deleted and recreated.


4. Enter salary and hours

  1. In Annual/Term Salary, enter the semester stipend amount (since you selected semester basis).

  2. Enter Hours per Week = 20.


5. Set tuition exemption (if applicable)

  1. Locate the Tuition Exemption section.

  2. Select the appropriate option (e.g., ODU RF Tuition Exemption).

  3. Choose the degree level: Master’s or Doctoral (for your case: Doctoral).

  4. Indicate if you are covering 100% of tuition or another percentage, as required.


(Note: this tuition entry is separate from salary and fringe.)


6. Add the payline

  1. Scroll down to Payline and click “Add Payline”.

  2. Select the correct project from the list.

  3. For the payline details:

    • You can enter hours/week (e.g., 20) for the project.

    • Do not manually type the budget amount.

  4. To calculate salary for that payline:

    • Click “Calc” next to Budget on the right.

    • The system will calculate the salary based on the previously entered stipend and hours.

  5. Adjust any rounding (e.g., remove a $0.01 extra) if needed; see the sketch after this section.

  6. Click “Create” to finalize the payline.


(This covers salary only – no tuition, no fringe.)
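A minimal sketch of the arithmetic behind the Calc button, assuming it simply prorates the semester stipend by each project's share of the weekly hours (the numbers and project name are hypothetical):

Python:
# Hypothetical payline check: assumes Calc prorates the semester stipend
# by each project's share of the weekly hours.
stipend = 11000.00                  # semester stipend (example from step 0)
total_hours = 20                    # hours per week
paylines = {"RF project A": 20}     # hypothetical project -> hours/week

for project, hours in paylines.items():
    budget = round(stipend * hours / total_hours, 2)
    print(project, budget)

# If the rounded paylines sum to a cent more than the stipend,
# trim $0.01 from one payline, as in the rounding step above.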


7. Review, edit, and submit

  1. Click “Save” to save the assignment.

  2. To review or change details:

    • Go to the top and click “Edit Assignment” (green button).

    • Confirm:

      • Employee type = GR

      • Correct home department

      • Semester basis and semester

      • Stipend amount and 20 hours/week

      • Tuition exemption details

      • Correct project and calculated payline

  3. When everything looks correct, click “Submit”.

  4. The status will show pending chair approval, routed through the student’s home department.


8. If the department is wrong

  • The department cannot be edited in an existing assignment.

  • The RF staff must delete the assignment, and you must create a new one with the correct department.

  • If you are unsure of the student’s department:

    • Check the offer letter, the program catalog, or contact the Graduate School / program.

    • You can also coordinate with RF staff to help verify if needed.


9. New hire paperwork

  • RF will request new hire paperwork from the student if needed.

  • Remind the student to complete all HR documents promptly so the GRA appointment can be processed on time.


Monday, November 17, 2025

Saturday, November 15, 2025

patent related to AI

 

Neural taxonomy expander

https://patents.google.com/patent/US20250259081A1/en


Wednesday, November 12, 2025

AMIA LLM workshop, material AI for Health

 

https://workshopamia2025.github.io/AMIA-KDDM-2025/


https://workshopamia2025.github.io/AMIA-KDDM-2025/slides/AMIA%202025%20KDDM%20Workshop_Demo_Balu_Version%202.pdf



Monday, November 10, 2025

ROSMAP data

 


The links are

Command line:
synapse get -r syn21311380

Python:
import synapseclient
import synapseutils

# Log in with a Synapse personal access token, then recursively
# download everything under syn21311380 to the working directory.
syn = synapseclient.Synapse()
syn.login(authToken="YOUR_TOKEN_HERE")
files = synapseutils.syncFromSynapse(syn, 'syn21311380')
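(Note: the authToken login assumes a Synapse personal access token, which can be generated under your Synapse account settings; access to syn21311380 must already have been approved via the data access request process.)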

Wednesday, November 5, 2025

Spectral Foundations of Elastic Network Models

Elastic Network Models (ENMs) are deeply connected to spectral analysis, both mathematically and conceptually.
In fact, the core of ENM theory is spectral analysis of the system’s stiffness (Hessian) matrix.
Here’s a detailed explanation that connects ENM theory to classical and modern spectral methods across physics, engineering, and applied mathematics:


1. Spectral Foundation of ENM

When you build an ENM, you define a stiffness matrix \( H \) that encodes spring interactions between all node pairs (atoms, residues, or coarse elements).
ENM analysis then solves the eigenvalue problem:

\[
H \mathbf{u}_k = \lambda_k \mathbf{u}_k
\]

This is precisely a spectral decomposition of the network’s Laplacian-like operator:

  • \( \mathbf{u}_k \): eigenvectors (normal modes)

  • \( \lambda_k \): eigenvalues (mode stiffness or squared frequency)

So the “spectrum” of an ENM — the ordered list of eigenvalues — represents the vibrational frequency spectrum of the molecular or mechanical network.


2. Connection to Graph Spectral Theory

The ENM is mathematically equivalent to a weighted graph Laplacian:

\[
L_{ij} =
\begin{cases}
-k_{ij} & i \neq j, \\
\sum_{m \neq i} k_{im} & i = j,
\end{cases}
\]

where \( k_{ij} \) are spring constants.
In this form:

  • \( L \) is symmetric and positive semidefinite.

  • Its eigenvectors describe collective deformation patterns.

  • Its eigenvalues describe mode stiffness (\( \lambda \)) or oscillation frequencies (\( \omega^2 \)).

This is directly analogous to spectral graph theory, where eigenvectors of \( L \) define smooth “vibrations” over a network — the same concept used in graph signal processing and diffusion geometry.


3. Low-Frequency Spectrum → Global Collective Motion

In ENM:

  • The lowest nonzero eigenvalues correspond to soft, large-scale motions (e.g., hinge bending in proteins).

  • The high-frequency spectrum corresponds to local vibrations (e.g., bond stretching).

Spectral analysis isolates these frequencies, enabling dimensionality reduction:
Instead of using 3N coordinates, ENM keeps only the first ~20–50 eigenmodes — just as in principal component analysis (PCA) or Fourier decomposition.

This is why ENM-based normal mode analysis (NMA) is sometimes called “spectral mode decomposition of structure.”


4. Spectral Analogies in Other Domains

Field | Operator | Eigenvectors represent | ENM Equivalent
Quantum mechanics | Schrödinger operator | stationary states | normal modes
Graph theory | Laplacian | diffusion patterns | elastic vibrations
Image processing | Graph Laplacian | texture or segmentation basis | molecular domain partitioning
Mechanics | Stiffness/mass matrices | vibration shapes | residue motion
Data science | Covariance matrix | PCA directions | soft collective modes

Thus, the ENM spectrum is analogous to a vibrational fingerprint or energy landscape basis — widely used for reduced modeling, clustering, and dynamics inference.


5. Spectral Quantities Extracted from ENM

Spectral Quantity | Meaning / Use
Eigenvalues \( \lambda_k \) | Mode stiffness or frequency; related to energy curvature
Eigenvectors \( u_k \) | Collective motion directions
Spectral density | Distribution of stiffness over frequency bands
Spectral gap | Rigidity vs. flexibility contrast; useful for detecting modular domains
Participation ratio | Localization measure of each mode
Mode overlap | Projection between observed conformational changes and ENM modes
Spectral entropy | Quantifies complexity of the motion spectrum

6. Spectral Applications in ENM Research

  1. Dimensional reduction – Keep the top 10–20 softest modes to represent conformational changes efficiently.

  2. Domain decomposition – Detect flexible regions by analyzing spectral gaps or nodal structure of eigenmodes.

  3. Allosteric communication – Use mode correlation spectra to identify dynamic coupling between residues.

  4. Multiscale analysis – Compare spectra across coarse-grained and fine-grained ENMs to infer scale invariance.

  5. AI/ML integration – Use ENM eigenvalues as spectral graph features for learning protein motion embeddings.


7. Summary Connection

Concept | ENM Term | Spectral Analogy
Stiffness matrix \( H \) | Elastic coupling | Graph Laplacian
Normal modes | Eigenvectors | Basis functions
Mode frequencies | Eigenvalues | Spectrum
Collective motion | Low-frequency subspace | Smooth eigenfunctions
Rigidity transition | Spectral gap | Connectivity change

Key Insight:
The Elastic Network Model is not just similar to spectral analysis — it is a spectral analysis of molecular mechanics.
It’s a graph-based eigen-decomposition of the protein’s elastic energy landscape.


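A small numerical example makes this concrete. The sketch below (numpy only, with a hypothetical 5-node chain ENM using unit springs between neighbors) builds the stiffness matrix, solves the eigenvalue problem, and computes each mode's participation ratio:

Python:
import numpy as np

# Hypothetical 5-node chain ENM with unit springs between neighbors.
n = 5
k = np.zeros((n, n))
for i in range(n - 1):
    k[i, i + 1] = k[i + 1, i] = 1.0          # spring constants k_ij

# Stiffness matrix = weighted graph Laplacian:
# H[i, j] = -k_ij for i != j, and H[i, i] = sum_m k_im.
H = np.diag(k.sum(axis=1)) - k

# Spectral decomposition H u_k = lambda_k u_k (eigh sorts ascending).
eigvals, eigvecs = np.linalg.eigh(H)
print("eigenvalues:", np.round(eigvals, 3))  # first is ~0: rigid-body mode

# Participation ratio: ~1 for a fully delocalized mode,
# ~1/n for a mode localized on a single node.
pr = 1.0 / (n * (eigvecs ** 4).sum(axis=0))
print("participation ratios:", np.round(pr, 3))

For this chain, the softest nonzero mode varies smoothly from one end to the other (the one-dimensional analogue of hinge bending), while the stiffest mode alternates sign between neighbors (a local vibration), matching the low/high-frequency split described in section 3.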

Friday, October 31, 2025

ODU Google MonarchSphere

 

https://www.odu.edu/article/old-dominion-university-and-google-launch-a-first-of-its-kind-ai-incubator-for-higher


https://www.meritalkslg.com/articles/old-dominion-university-launches-monarchsphere-ai-incubator/?utm_source=chatgpt.com


  • Brian O. Hemphill, Ph.D. – President of Old Dominion University

  • Matthew Schneider – Managing Director, National, U.S. State, Local, and Education, Google

  • Tony Orlando (B.S. ’91, MBA ’98) – Managing Director, Partner Ecosystem and Specialty Solutions, Google

  • Chrysoula Malogianni, Ph.D. – Senior Associate Vice President for Digital Innovation, ODU

  • Nina Rodriguez Gonser – Vice President for Digital Transformation and Technology, ODU

  • The Hon. Don Scott – Speaker of the Virginia House of Delegates

  • The Hon. L. Louise Lucas – President Pro Tempore, Senate of Virginia

Wednesday, October 29, 2025

    gtc day 2

     

    meet up, pathogen AI, 

    microbial, multi modal



    GWU James Hahn, AI, pediatrics

    https://engineering.gwu.edu/james-hahn


    https://github.com/Ezharjan/AutoSceneGen

    https://ezharjan.github.io/AutoSceneGen/

    https://github.com/Ezharjan?tab=repositories



    Tuesday, October 28, 2025

    gtc day 1

     https://www.nvidia.com/en-us/high-performance-computing/earth-2/


    AI to biology is like math to physics


    George Church, Dyno, Manifold

    Company | Focus / AI-relevance | Notes
    Lila Sciences | AI-agent platform startup | George Church joined as Chief Scientist in 2025 for Lila Sciences, which is specifically described as an “AI agent platform” startup. (Wikipedia)
    GC Therapeutics | Cell-therapy company; includes machine-learning / “plug & play” tech | Although primarily a biotech/cell-therapy company, GC Therapeutics uses a platform that combines synthetic biology, gene editing, cell engineering and machine learning. (Fierce Biotech)


    David Baker, cofounder

    aridunah duvan, CEO

    Peter Körte, Siemens

    Brett Adcock, Figure AI

    Foxconn, Young Liu, CEO


    CPU general purpose to GPU accelerated computing platform shift

    reinvent new algorithms

    CUDA-X libraries, 300+

    medical imaging framework

    genomic processing


    codesign 

    mixture of experts on the chip, thinking GPU, 32 expert GPUs




    Thursday, October 23, 2025

    20251023Thu transformer, part 2

      == pre-class to do: 

    post video:

    calendar email invitation: 

    homework assignment, data camp, 

    socrative sign in

    update Canvas course materials, update learning objectives. assignments as needed:

    Test-run code: skip. 

    kindle book. using ipad to highlight key points. 


    == In-class to do: 

    clean up desktop space, calendars, 

    ZOOM, live transcript (start video recording). 

    Socrative sign in, 

    transformer gpt.ipynb on vertex ai, 

     decoder only

     causal masks, 

     temperature 


    student presentation: Google Day practice

    breakout rooms:   course project. 


    How Alicia Jackson Could Redefine ARPA-H’s AI Future

     https://www.politico.com/newsletters/future-pulse/2025/10/22/arpa-h-new-director-00618080


    How Alicia Jackson Could Redefine ARPA-H’s AI Future


    Alicia Jackson’s DARPA roots could profoundly reshape how ARPA-H approaches artificial intelligence — especially generative AI in the life sciences.


    While POLITICO reports that the Trump administration cut several ARPA-H AI programs in areas like AI-driven cancer detection and preventive care, that doesn’t mean a retreat from AI. It signals a strategic pivot — from broad, exploratory projects to mission-focused, biologically grounded applications.


    🔹 1. From Algorithms to Living Systems


    At DARPA’s Biological Technologies Office, Jackson led visionary programs such as Living Foundries and BRICS, advancing programmable biology and biosafety.

    Her philosophy treats AI as a design engine — a tool for creating biological systems, not just interpreting them.

    Jackson’s Focus | How Generative AI Fits In
    Programmable biology | AI models that design new enzymes, antibodies, or pathways.
    Biomanufacturing efficiency | Reinforcement learning to optimize cell or microbial production.
    Predictable, controllable systems | AI that forecasts biological stability and detects anomalies in real time.


    🔹 2. Translating AI into the Real World


    Jackson’s entrepreneurial work — from Evernow to Drawbridge Health — points to a leader focused on translation and commercialization.

    Expect ARPA-H to favor AI that accelerates real-world deployment, not theoretical modeling.


    Likely directions:

    • Digital biomanufacturing twins for faster FDA qualification

    • Human-in-the-loop generative design for explainable AI innovation

    • Regulatory-ready AI models aligned with FDA’s evolving digital-health framework


    🔹 3. Safety, Robustness, and Governance at the Core


    Jackson’s history with Safe Genes and BRICS highlights her awareness of biosecurity and dual-use risks.

    Her ARPA-H will likely push for “safe and governed” AI, emphasizing:

    • Explainable generative models for biology

    • Ethical-control frameworks for AI that manipulates living systems

    • Red-teaming and validation pipelines — directly inspired by DARPA safety protocols


    In practice, that means generative tools will need built-in containment logic to prevent unintended or dangerous outputs.


    🔹 4. What Future ARPA-H AI Projects Might Look Like

    ARPA-H Priority Area | AI Application Example | Strategic Outcome
    Rapid Bio-Design Platforms | Foundation models for proteins and RNA | Faster molecule discovery for health and defense
    Scalable Biomanufacturing | Generative control of microbial or cell-free systems | On-demand vaccines, hormones, or nutrients
    Neuro-Restoration Interfaces | Generative neural encoding | Brain recovery and adaptive prosthetics
    Women’s Health & Aging | Personalized AI for hormonal and aging biomarkers | Precision-health insights with consumer impact
    AI Safety in Biotechnology | Red-team and governance frameworks | Mitigate dual-use and biosecurity risks


    🔹 5. The Bigger Picture


    Under Jackson’s leadership:

    • AI won’t vanish — it will integrate deeply into bioengineering.

    • Generative-AI funding will favor tangible biological prototypes, not abstract tools.

    • Open-ended “AI-for-everything” research will give way to DARPA-style challenges — measurable, outcome-driven, and safety-conscious.


    In short, ARPA-H’s next AI chapter will likely merge engineering discipline with biological imagination — turning AI into a creative partner for the life sciences, not just an observer.


    Tuesday, October 21, 2025

    synapse request access

     

    AD Knowledge Portal

    https://www.synapse.org/Synapse:syn31512863

    Thursday, October 16, 2025

    influenza virus, known as D/HY11,

     A team of researchers at the Changchun Veterinary Research Institute in China has zeroed in on a specific strain of the influenza virus, known as D/HY11, which emerged in cattle in northeast China in 2023.


    model the evolution and species jump using viralGPT. 

    policy map

    policymap.com

    https://www.policymap.com/

    maternal vulnerability

    https://mvi.surgohealth.com/

    BMS program

     

    BMS meeting. 


     Robert Bruno, 3D bioprint and cancer

     Lifang Yang

     Larry Sanford

     Frank Lattanzio

     Patrick Sachs

     Siqi Guo

     Ebony Clark

     Lisa Shollenberger

     Peter Mollica


    • Lifang Yang: Focuses on fundamental and translational cancer research, specifically cancer pathogenesis, biomarker development, and therapeutic approaches. Her lab studies tumor cells, tumor microenvironment, extracellular vesicles, proteomics, and cancer disparities to advance precision oncology through multi-omics and bioinformatics approaches.
    • Frank Lattanzio: Connected to bioelectrics research, including applications of nanosecond pulsed electric fields (nsPEFs) in cancer treatment and cellular electropermeabilization mechanisms.
    • Patrick Sachs: Associated with biomedical and translational sciences, likely focusing on biomedical engineering and tumor microenvironment studies combined with 3D bioprinting and cancer research models.
    • Siqi Guo: Involved in bioelectric research, including DNA vaccination delivery, electrotransfer, and electroporation-mediated gene transfer techniques; associated with cellular and molecular response to electric fields.
    • Siqi Guo is a Research Associate Professor affiliated with the Frank Reidy Research Center for Bioelectrics. His grants include a commercial contract worth about $102,900 (2016-2017) for studying nanosecond electric pulses (NSEPS) as an ablation-immunotherapy for advanced pancreatic cancer. His work focuses on cancer, biotechnology, and bioelectric therapies like nano-pulse stimulation and gene electrotransfer for cancer treatment. He leads projects advancing novel immunotherapy techniques based on electric pulse technology.
    • Lifang Yang, listed as an instructor at Eastern Virginia Medical School collaborating with ODU, shares in recent multidisciplinary seed funding ($42,000) supporting research on synergistic effects of nano-pulse treatment combined with cold plasma reactive species for cancer treatment. Her expertise centers on cancer pathogenesis, tumor microenvironment, extracellular vesicles, and biomarker discovery aiming to improve precision oncology through multi-omics and bioinformatics.


    Tuesday, October 14, 2025

    20251016Thu Transformer

    == pre-class to do: 

    post video

    calendar email invitation: 

    socrative sign in

    update Canvas course materials, update learning objectives. assignments as needed:

    kindle book. 

    == In-class to do: 

    clean up desktop space, calendars, 

    ZOOM, live transcript (start video recording). 

    ** Google Day presentation; 

    Socrative sign in, 

    student presentation; 

    breakout rooms:   course project poster work; 


    Thursday, October 2, 2025

    lect 6, energy-based model

       == pre-class to do: 

    post video of lect 4 LSTM

    calendar email invitation: 

    homework assignment, data camp, 

    socrative sign in

    update Canvas course materials, update learning objectives. assignments as needed:

    Test-run code: skip. 

    kindle book. using ipad to highlight key points. 


    == In-class to do: 

    clean up desktop space, calendars, 

    ZOOM, live transcript (start video recording). 

    Socrative sign in, 

    student presentation; Terry and ?? 

    breakout rooms:   course project


    Monday, September 29, 2025

    NVIDIA Accelerates Robotics Research and Development With New Open Models and Simulation Libraries

     

    “Humanoids are the next frontier of physical AI, requiring the ability to reason, adapt and act safely in an unpredictable world,” said Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA. “With these latest updates, developers now have the three computers to bring robots from research into everyday life — with Isaac GR00T serving as robot’s brains, Newton simulating their body and NVIDIA Omniverse as their training ground.”

    Friday, September 26, 2025

    ODU faculty profile link

     ODU faculty profile link, where profile image and some information can be updated

    https://monarchprofile.odu.edu/

    Thursday, September 25, 2025

    GCP Google Earth Engine

    GCP Google Earth Engine 

    Pratap Ramamurthy


    Jeremy Malczyk

    Ranadheer Mettu


    https://scholarsarchive.byu.edu/etd/10705/


    https://www.cloudskillsboost.google/focuses/75154?catalog_rank=%7B%22rank%22%3A1%2C%22num_filters%22%3A0%2C%22has_search%22%3Atrue%7D&parent=catalog&search_id=54057579


    https://www.eefabook.org/

    lect 5, normalizing flow

      == pre-class to do: 

    post video of lect 4 LSTM

    calendar email invitation: 

    homework assignment, data camp, 

    paper selection, high quality, primary research paper. 

    potential project (agentic bioinformatics analysis, agentic lab report?, pretraining of transformer, word embedding)

    socrative questions (questions on contents from last lecture): TF on VAE

    update Canvas course materials, update learning objectives. assignments as needed:

    Test-run code: skip. 

    kindle book. using ipad to highlight key points. 


    == In-class to do: 

    clean up desktop space, calendars, 

    ZOOM, live transcript (start video recording). 

    Socrative sign in, skipped

    Anton: presentation

    Normalizing flow

    breakout rooms:   student Kris; 


    Saturday, September 20, 2025

    2025 Pan Atlantic Middle School Olympiad

     https://www.modusponensinstitute.com/

    2025 Pan Atlantic Middle School Olympiad

    The first Middle School Ethics Olympiad in the Western Hemisphere.

     

    November 22nd, 9am PST. 

    Qualify for the International Middle School Olympiad!


    Thursday, September 18, 2025

    lec 4, LSTM

     == pre-class to do: 

    post video of lect 3 GAN

    calendar email invitation: 

    homework assignment, data camp, 

    paper selection, high quality, primary research paper. 

    potential project (agentic bioinformatics analysis, agentic lab report?, pretraining of transformer, word embedding)

    socrative questions (questions on contents from last lecture): TF on VAE

    update Canvas course materials, update learning objectives. assignments as needed:

    Test-run code: skip. 

    kindle book. using ipad to highlight key points. 


    == In-class to do: 

    clean up desktop space, calendars, 

    ZOOM, live transcript (start video recording). 

    Socrative sign in, skipped

    Sergio presentation on DoRA

    GAN, principle in pdf, then kindle textbook, 

    breakout rooms:   discuss course projects. 


    Meeting assets for 202510_CS_795_21992_CS 795 gAI Thu Evening are ready! 

    Meeting summary 

    Quick recap

    The professor outlined requirements for a generative AI course project focusing on ethical and legal considerations, with students needing to submit proposals within a week. The discussion covered various aspects of autoregression models, tokenization in NLP, and the structure and operation of LSTM cells, including how temperature parameters affect model behavior and creativity. The session concluded with a student presentation on weight decomposed low-rank adaptation methods and their performance compared to full fine-tuning, followed by an announcement about breakout rooms for project discussions.

    Next steps

    • All students: Submit project proposals with title, team members, brief background, motivation, available datasets, AI approach, references, and resource requirements
    • Students: Limit project teams to a maximum of two members
    • Students: Ensure each team member makes meaningful contributions to the project if working in pairs
    • Students: Contact the ODU GCP team for support if needing high-end GPUs
    • Evan: Proceed with his project on jailbreaking large language models

    Summary

    Generative AI Course Project Requirements

    The professor explained that the course project is the only project in the course and must be related to generative AI, with a focus on ethical and legal considerations. He clarified that the project can involve training discriminators or jailbreaking models, but emphasized that the end result should have a generative aspect. The professor also discussed the project proposal requirements, including team composition, background, data sets, AI approaches, and resource needs, and mentioned that students have one week to submit their proposals.

    Understanding Autoregression and Tokenization

    Hong discussed the autoregression model, highlighting that while ChatGPT is a popular example, there are many other models including LSTM and GRU. He explained the concept of tokenization in natural language processing, noting its importance in splitting text into smaller units for analysis. Hong also described the process of embedding, where tokens are represented by continuous floating-point numbers, often trained in the context of input or output, and how this can lead to interesting and meaningful representations.
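    As a concrete illustration of tokenization followed by embedding lookup, here is a toy numpy sketch (the sentence and the 8-dimensional embedding table are made up):

Python:
import numpy as np

# Toy word-level tokenization over a hypothetical sentence.
text = "the cat sat on the mat"
vocab = {w: i for i, w in enumerate(sorted(set(text.split())))}
tokens = [vocab[w] for w in text.split()]

# Embedding lookup: each token id maps to a continuous vector that
# would be trained along with the rest of the model.
rng = np.random.default_rng(0)
E = rng.normal(size=(len(vocab), 8))    # made-up 8-dim embedding table
embedded = E[tokens]
print(tokens, embedded.shape)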

    LSTM Cell Operations and Memory

    Hong explained the structure and operation of LSTM (Long Short-Term Memory) cells, focusing on how they differ from traditional recurrent neural networks. He described how LSTMs use a cell state that maintains memory through weighted matrices shared across all time steps, and detailed the four key operations within each LSTM cell: forget, input, cell state update, and output. Hong noted that while LSTMs were a significant improvement over simple RNNs when introduced 28 years ago, they are now considered less efficient than Transformers due to their fixed weight matrices.
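    For reference, a single LSTM time step with the four operations, as a self-contained numpy sketch (dimensions are made up; this is not the lecture's code):

Python:
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    # One time step; W and b are shared across all time steps.
    z = W @ np.concatenate([h, x]) + b
    f, i, g, o = np.split(z, 4)          # forget, input, candidate, output
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)   # cell-state update
    h_new = sigmoid(o) * np.tanh(c_new)                # output
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_h = 3, 4                          # made-up input/hidden sizes
W = rng.normal(size=(4 * n_h, n_h + n_in))
b = np.zeros(4 * n_h)
h, c = np.zeros(n_h), np.zeros(n_h)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
print(h, c)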

    Temperature Parameters in Language Models

    The discussion focused on the implementation and behavior of temperature parameters in language models, particularly LSTM models. Hong explained how temperature affects the stochastic nature of model predictions, with higher temperatures increasing randomness and lower temperatures making outputs more deterministic. Hamza and Evan clarified that temperature controls the creativity and randomness of model outputs, with Evan confirming this through research. The group also discussed the limitations of Keras for modifying AI models compared to PyTorch, noting its industrial nature and declining usage.
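    Temperature scaling itself is one line: divide the logits by T before the softmax. A minimal numpy sketch with hypothetical next-token scores:

Python:
import numpy as np

def sample_with_temperature(logits, temperature, rng):
    # Temperatures below 1 sharpen the distribution (more deterministic);
    # temperatures above 1 flatten it (more random).
    z = logits / temperature
    p = np.exp(z - z.max())
    p /= p.sum()
    return rng.choice(len(p), p=p)

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.1])   # hypothetical next-token scores
print(sample_with_temperature(logits, 0.5, rng))   # usually token 0
print(sample_with_temperature(logits, 2.0, rng))   # more varied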

    Quantum LSTM and Machine Learning

    Hong discussed the evolution and modifications of recurrent neural networks, particularly focusing on the LSTM (Long Short-Term Memory) and its limitations. He explained how a modified version of the recurrent unit, known as the Gated Recurrent Unit (GRU), lacks a cell state and memory, which could potentially make it faster to train but less effective in retaining long-term dependencies. Hong also introduced the concept of quantum machine learning, highlighting a recent development where a classical LSTM was combined with a quantum encoder to create a quantum LSTM. He emphasized the potential for quantum computing to revolutionize machine learning and suggested that future generations might need to learn quantum machine learning, even though it is still in its early stages.

    Student Presentation Break Discussion

    Hong and Sergio discussed the first student presentation of the day, agreeing to take a 5-minute break before resuming at 7:05. Sergio confirmed he was ready to present and successfully shared his screen for the presentation.

    Decomposed Low-Rank Adaptation Techniques

    Sergio presented on weight decomposed low-rank adaptation, introduced by the NVIDIA group, which builds upon LORA (Low-Rank Adaptation) and DORA (Decomposed Low-Rank Adaptation). He explained that while full fine-tuning adjusts all parameters, parameter-efficient methods like LORA and DORA only modify specific components, aiming to replicate full-tuning results with fewer computations. Sergio detailed how DORA decomposes weights into magnitude and direction, allowing independent updates, which leads to a learning pattern closer to full-tuning compared to LORA. Hong and Evan asked clarifying questions about the decomposition process and the implications of the negative slope in the learning trajectory, which Sergio explained as a lack of correlation between magnitude and direction changes. Terry inquired about training time differences between methods, which Sergio did not fully address in the transcript.
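    The decomposition Sergio described can be sketched in a few numpy lines (assuming DoRA's column-wise weight norm; the low-rank update rule itself is omitted):

Python:
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(6, 4))       # hypothetical pretrained weight matrix

# Decompose into magnitude and direction, column-wise as in DoRA.
m = np.linalg.norm(W, axis=0, keepdims=True)   # per-column magnitude
V = W / m                                      # unit-norm direction columns
assert np.allclose(m * V, W)

# Fine-tuning can then update m and V independently, e.g. with a
# LoRA-style low-rank update applied to the direction only.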

    DORA: Parameter-Efficient Model Tuning

    Sergio explained the concept of DORA, a parameter-efficient tuning method that reduces the number of parameters by decomposing the weight matrix, resulting in faster training times compared to full tuning. He highlighted that while DORA introduces some computational cost during tuning, it does not affect model latency during inference. Hong inquired about the meaning of scores in the results, and Sergio clarified that higher scores indicate better performance, though the specific metrics are not clearly defined for generative AI. They also discussed the hyperparameters used, including rank (R), which is a key tuning parameter, and Sergio explained how R is chosen based on the results and dimensions of the weight matrix.

    Parameter-Efficient Model Tuning Discussion

    The group discussed the performance of DORA and LoRA models, focusing on their efficiency and accuracy compared to full fine-tuning. Sergio explained that DORA can achieve similar or better accuracy than full fine-tuning with fewer parameters, while LoRA performs better with higher ranks but requires more computational resources. The team also explored the concept of quantized models, where the pre-trained model is compressed to reduce memory demands. Hong clarified questions about the ranking system and parameter usage, and the group discussed the implications of different parameter-efficient tuning methods on model latency. Finally, Hong announced that breakout rooms would be set up for students to discuss potential course projects, with 10 rooms available for participants to join.


    Thursday, September 11, 2025

    lec 3, GAN

     == pre-class to do: 

    post video of lec 2 VAE. 

    calendar email invitation: 

    homework assignment, data camp, 

    paper selection, high quality, primary research paper. 

    potential project (agentic bioinformatics analysis, agentic lab report?, pretraining of transformer, word embedding)

    socrative questions (questions on contents from last lecture): TF on VAE

    update Canvas course materials, update learning objectives. assignments as needed:

    Test-run code: skip. 

    kindle book. using ipad to highlight key points. 


    == In-class to do: 

    clean up desktop space, calendars, 

    ZOOM, live transcript (start video recording). 

    Socrative sign in, review VAE


    == summary, review VAE

    GAN, principle in pdf, then kindle textbook, 

    breakout rooms, 


    Meeting summary 

    Quick recap

    The meeting began with a review session on variational autoencoders, where students demonstrated good understanding of key concepts including the variational loss function and reparameterization trick. The discussion then moved to Generative Adversarial Networks (GANs), covering their fundamental components, mathematical framework, and training processes, including the challenges and advancements in model training. The latter part of the meeting focused on practical aspects, including the implementation of GANs for image generation, the use of Google Cloud Platform resources like Vertex AI for machine learning applications, and guidelines for course presentations and storage of work.

    Next steps

    • There are no action items or next steps identified in the provided content. The text only states that the material reviewed was an educational presentation about GANs without any action items being assigned.

    Summary

    Variational Autoencoder Review Session

    Hong led a review session on variational autoencoders, confirming that the encoder maps input data to a single latent vector with randomness introduced through auxiliary parameters. Students demonstrated good understanding of concepts like the variational loss function, which includes both reconstruction loss and a regularization term (KL divergence), and the reparameterization trick that allows backpropagation through sampling steps. Hong noted that while some students hadn't signed in, there were 9 confirmed participants, and mentioned that AI meeting note-taking tools were being used by many attendees. The session concluded with a brief mention of moving on to Generative Adversarial Networks in the next lecture.
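    The reparameterization trick mentioned above, in a short numpy sketch (the encoder outputs mu and log-variance are hypothetical):

Python:
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.3, -1.2])        # hypothetical encoder mean
log_var = np.array([0.0, -0.5])   # hypothetical encoder log-variance

# z = mu + sigma * eps with eps ~ N(0, I): the randomness lives in eps,
# so gradients can flow through mu and log_var during backpropagation.
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps
print(z)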

    Understanding Generative Adversarial Networks

    Hong explained the concept of Generative Adversarial Networks (GANs), which involve a discriminator and a generator. The discriminator aims to distinguish between real and fake data, while the generator creates synthetic data to fool the discriminator. The goal is to reach an equilibrium where the discriminator cannot reliably identify fake data, achieving a 50-50 chance of correct classification. Hong also described the mathematical framework of the value function that guides the training process, highlighting the adversarial nature of the optimization procedure.

    Binary Classifier Loss Function Overview

    Hong explained the mathematical foundation of a binary classifier using cross entropy loss, describing how the value function can be expressed in terms of Kullback-Leibler divergence and Jensen-Shannon divergence between real data and generated distributions. He outlined the training process as a two-step procedure: first maximizing the discriminator using the full loss function, and then minimizing the generator using a simplified version of the loss.
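    The two-step procedure in numbers, as a numpy sketch (the discriminator outputs below are hypothetical; in practice they come from D(x) and D(G(z))):

Python:
import numpy as np

def bce(p, label):
    # Binary cross entropy for predicted probability p and target label.
    return -(label * np.log(p) + (1 - label) * np.log(1 - p))

d_real = np.array([0.9, 0.8])   # hypothetical D(x) on real samples
d_fake = np.array([0.3, 0.1])   # hypothetical D(G(z)) on fake samples

# Step 1: train the discriminator with labels 1 (real) and 0 (fake).
d_loss = bce(d_real, 1).mean() + bce(d_fake, 0).mean()

# Step 2: train the generator on the simplified loss -log D(G(z)),
# i.e. BCE of the fakes against label 1.
g_loss = bce(d_fake, 1).mean()
print(d_loss, g_loss)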

    Advancements in Generative Model Training

    Hong discussed the challenges and advancements in training generative models, focusing on the WGAN with gradient penalty as the current state-of-the-art method. He explained the technical details of the WGAN, including its use of the Earth mover's distance and the introduction of the epsilon parameter for balancing real and fake data. Hong also highlighted the practical implementation of the WGAN using a real-world example involving the detection of fake bricks, which was demonstrated using a dataset of Lego bricks.
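    A minimal PyTorch sketch of the gradient-penalty term with a toy linear critic (the critic and all shapes are made up; eps is the interpolation coefficient mentioned above):

Python:
import torch

critic = torch.nn.Linear(4, 1)              # toy stand-in for the critic
real, fake = torch.randn(8, 4), torch.randn(8, 4)

# Interpolate between real and fake samples with a random eps per sample.
eps = torch.rand(8, 1)
interp = (eps * real + (1 - eps) * fake).requires_grad_(True)

# Penalize deviations of the critic's gradient norm from 1.
out = critic(interp)
grads = torch.autograd.grad(out.sum(), interp, create_graph=True)[0]
gp = ((grads.norm(2, dim=1) - 1.0) ** 2).mean()
print(gp.item())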

    Image Generation Model Architecture Overview

    Hong explained the structure of a discriminator and generator model for image generation, noting that the discriminator is a convolutional neural network with a sigmoid output for binary classification, while the generator is similar to a variational autoencoder. Hong outlined the training process, which involves computing binary cross-entropy loss for both the discriminator and generator, and mentioned that the optimizer is specified elsewhere in the code. The discussion touched on the technical details of image expansion methods and the inclusion of noise in the loss function to improve model performance.

    Enhancing GANs with Gradient Penalty

    Hong discussed the implementation and effectiveness of a generative adversarial network (GAN) with a gradient penalty (GP) for image generation. They explained how the GP is calculated and its role in improving the quality of generated images compared to traditional GANs. Hong also introduced the concept of conditional GANs, which concatenate label information to the input and showed that this simple modification can significantly enhance performance.

    Generative AI and Cloud Platforms

    Hong discussed the evolution of generative AI methods, noting that while the generative adversarial network (GAN) approach was a significant milestone in 2014, the field has since shifted with the advent of agentic AI, which allows for more specialized and sophisticated critiques. Hong also addressed the use of Google Cloud Platform (GCP) and Vertex AI for students in the class, explaining that while GCP provides a range of industrial-level AI tools, the Vertex AI environment is still in its early stages and may require further development. Evan pointed out that the current GCP course focuses mainly on knowledge checks rather than practical use, and Hamza inquired about the speed and capabilities of Vertex AI compared to ODU's supercomputers, to which Hong clarified that the platforms serve different purposes and are not directly comparable.

    Vertex AI Service Overview

    Terry demonstrated how to access and use Vertex AI, a Google Cloud service for machine learning and AI applications. He explained the difference between on-premises clusters and cloud resources, emphasizing that Vertex AI provides a managed service for model development, training, and deployment. Terry showed the class how to log into Google Cloud using their ODU student accounts and navigate the Vertex AI interface, highlighting key features like the model garden, Vertex AI studio, notebooks, and deployment options.

    GCP Resources and Presentation Guidelines

    The meeting focused on discussing the use of Google Cloud Platform (GCP) resources for the course, particularly Vertex AI and storage solutions. Terry explained that a shared project exists for the class, but students should be cautious about deleting each other's work. He demonstrated how to use buckets for storage and recommended copying important data to Git if needed. The group discussed potential future changes to permissions and the possibility of creating individual projects for each student. Hong clarified that presentations should be individual, not group projects, and explained the format and content expectations for presentations. The class was reminded to save their work before the semester ends, as resources may be deleted afterward.


    Wednesday, September 3, 2025

    lec 2, gAI, VAE

      == pre-class to do: 

    post video of lec 1.  done

    calendar email invitation: done 

    homework assignment, data camp, 

    paper selection:  

    potential project (agentic bioinformatics analysis, agentic lab report?, pretraining of transformer, word embedding)

    socrative questions (questions on contents from last lecture ): 

    update Canvas course materials, update learning objectives. assignments as needed:

    Test-run code: skip. 


    kindle book. using ipad to highlight key points. 


    == In-class to do: 

    clean up desktop space, calendars, 

    ZOOM, live transcript (start video recording). 

    Socrative sign in 

    => go over assignments, video, datacamp

    => Kingma and Welling, 2013, arXiv

    => hqin's proof work

    => further reading, kingma 2019 tutorial

    => play student videos, setup random breakout rooms to discuss presentation papers