Monday, July 31, 2023

UTC CSE adjunct position and training

=== begin ===

Subject: Adjunct Professor Position: Computer Science & Engineering at UTC

 

The Computer Science and Engineering Department at the University of Tennessee at Chattanooga seeks industry professionals to join as Adjunct Professors. Those with experience in computer science who can provide insights into current industry practices, technology trends, and problem-solving techniques are encouraged to apply.

 

This position offers a chance to teach and mentor in an academic setting. As an Adjunct Professor, you will interact with students interested in the latest industry practices and technologies. Your experience will offer them a perspective grounded in real-world applications.

 

Compensation starts at $1,500 per credit hour, and experts with a PhD often receive $2,000 per credit hour. The role demands clear communication, teaching skills, and a commitment to student learning and development.

 

Interested in contributing to the education of future computer science and IT professionals? Submit your application for the Adjunct Faculty, Computer Science & Engineering, position 16000000D6 at https://ut.taleo.net/careersection/utc_faculty/jobsearch.ftl.

 


adjunct teaching training



 

You will need to complete the FERPA and SSB training. The link is provided below.

 

https://www.utc.edu/enrollment-management-and-student-affairs/registrar/faculty-and-staff-resources/training

 

 

Let me know if you have any questions.

 

Thank you.


Your request for FERPA and Faculty SSB training has been received and the trainings have been assigned to you through Canvas. An email from Canvas has been sent to your UTC email account. Please check your Junk / Spam folder if you do not see it in your Inbox.  You may also access training via the link below. 

Follow the steps below to complete FERPA and Faculty SSB Training: 

1. Complete FERPA training

Log into Canvas or select the ‘View in Canvas’ link once you have logged into MyMocsNet.
Select the course ‘FERPA Training’.
Select the FERPA Training Module.
Select ‘Load FERPA: Family Educational Rights and Privacy Act in a new window’.
Start the training.

2. Complete Faculty SSB Training through UTC Learn (Canvas)

Log into Canvas or select the ‘View in Canvas’ link once you have logged into MyMocsNet.
Select the course ‘Records Training Class’.
Select the Faculty SSB training module. At the end of the training, you will need to score 80% or higher on the quiz. If you will also be advising students, please take the Faculty SSB training with Advising quiz.

 

3. Complete the Training Verification Form

Once all trainings have been completed, a Training Verification Form for Faculty and Staff will need to be submitted. You will need to enter the following:

Supervisor’s information
Your Department

Select ‘Next’.

This form will enter a workflow and go to your supervisor for approval.  Once your supervisor approves the form, it will route to Registrar Training where your training is verified and documented. Lastly, it will route to the Faculty Records Specialist.  


 

 


Friday, July 7, 2023

tax exempt documents UTC

UTC tax ID: Employer Identification Number (EIN) 62-6001636

Tax-exempt document for Tennessee:

 

http://controller.tennessee.edu/wp-content/uploads/sites/7/2023/06/Tennessee-Sales-and-Use-Tax-Exemption-Certificate-Signed-7.1.23-6.30.27-1.pdf


For other states’ sales taxes, here is a list of the exemptions we have:

 

https://controller.tennessee.edu/tax/sales-and-local-tax-exemptions/

 

 

Here is a link to UTC's income tax exempt IRS letter:

 

https://www.utc.edu/sites/default/files/2020-05/inst-fiscal-data-taxletter.pdf

 

It should not matter, but some donors demand to see that UTC is a 501(c)(3) charitable organization. Since UTC is a governmental entity, we are tax exempt under a different section of the Internal Revenue Code, 170(c)(1).



But the UC Foundation is a 501(c)(3) charitable organization and we can receive funds through them if the donor requires it. If that is the case, let me know and I will give you more information.

Sunday, July 2, 2023

Shapley Attributed Ablation with Augmented Learning (ShapAAL)

One new method that improves Shapley values for deep learning explanation is **Shapley Attributed Ablation with Augmented Learning (ShapAAL)**. It is a novel push-pull deep architecture in which subset selection through Shapley value attribution pushes the model to a lower dimension, while augmented training improves the model's learning capability on unseen data¹.


ShapAAL demonstrates that a deep learning algorithm trained on a suitably selected subset of the seen examples, that is, with the unimportant ones ablated from the given limited training dataset, can achieve consistently better classification performance under augmented training¹.
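
A quick way to see the core idea, apart from ShapAAL’s push-pull architecture, is to estimate a Shapley value for each training example and ablate the lowest-valued ones. The sketch below is a hypothetical Monte Carlo illustration: scikit-learn’s LogisticRegression stands in for the deep model, validation accuracy is the utility, and none of the names, parameters, or sizes come from the paper.

```python
# Hypothetical sketch: Monte Carlo Shapley values for training examples,
# then ablation of the least valuable ones. This illustrates Shapley-
# attributed ablation in general; it is NOT the ShapAAL architecture.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

def utility(idx, X_tr, y_tr, X_val, y_val):
    """Validation accuracy of a model trained on the subset idx."""
    if len(idx) == 0 or len(set(y_tr[idx])) < 2:
        return 0.0  # cannot fit a classifier on an empty or one-class subset
    model = LogisticRegression(max_iter=200).fit(X_tr[idx], y_tr[idx])
    return model.score(X_val, y_val)

def shapley_values(X_tr, y_tr, X_val, y_val, n_perms=10, seed=0):
    """Average marginal contribution of each example over random permutations."""
    rng = np.random.default_rng(seed)
    n, values = len(X_tr), np.zeros(len(X_tr))
    for _ in range(n_perms):
        perm = rng.permutation(n)
        prev = 0.0  # utility of the empty prefix
        for k in range(1, n + 1):
            u = utility(perm[:k], X_tr, y_tr, X_val, y_val)
            values[perm[k - 1]] += u - prev
            prev = u
    return values / n_perms

X, y = make_classification(n_samples=60, n_features=8, random_state=0)
X_tr, y_tr, X_val, y_val = X[:40], y[:40], X[40:], y[40:]
vals = shapley_values(X_tr, y_tr, X_val, y_val)
keep = np.argsort(vals)[10:]  # ablate the 10 lowest-valued training examples
print("full set:", utility(np.arange(40), X_tr, y_tr, X_val, y_val))
print("ablated :", utility(keep, X_tr, y_tr, X_val, y_val))
```

On toy data like this the ablated subset can match or beat the full set, which mirrors the paper’s “less is more” observation in miniature.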


Would you like more information on this topic?


Source: Conversation with Bing, 7/2/2023

(1) When less is more powerful: Shapley value attributed ablation with .... https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0277975.

(2) Explaining a series of models by propagating Shapley values. https://www.nature.com/articles/s41467-022-31384-3.

(3) [2104.02297] Shapley Explanation Networks - arXiv.org. https://arxiv.org/abs/2104.02297.

(4) GitHub - slundberg/shap: A game theoretic approach to explain the .... https://github.com/slundberg/shap.

flash attention

FlashAttention is an **IO-aware exact attention algorithm** that uses tiling to reduce the number of memory reads/writes between GPU high-bandwidth memory (HBM) and GPU on-chip SRAM. It is designed to be **fast and memory-efficient**¹.


It is optimal for a range of SRAM sizes and requires fewer HBM accesses than standard attention¹. FlashAttention trains Transformers faster than existing baselines and enables longer context lengths, yielding higher-quality models¹.
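
To make the tiling idea concrete, here is a small NumPy sketch of the online-softmax trick that lets attention be computed one key/value block at a time, keeping only running per-row statistics instead of the full attention matrix. It is an illustration of the algorithm under assumed shapes and block sizes, not FlashAttention itself, which is a fused GPU kernel.

```python
# A minimal NumPy sketch of tiled attention with an online softmax.
# Block size and shapes are illustrative assumptions, not the paper's code.
import numpy as np

def tiled_attention(Q, K, V, block=16):
    """softmax(Q K^T / sqrt(d)) V, computed one key/value block at a time."""
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(Q)           # running (unnormalized) output
    m = np.full(n, -np.inf)          # running row-wise max of the logits
    l = np.zeros(n)                  # running softmax normalizer
    for start in range(0, K.shape[0], block):
        Kb, Vb = K[start:start + block], V[start:start + block]
        S = (Q @ Kb.T) * scale       # logits for this block only
        m_new = np.maximum(m, S.max(axis=1))
        alpha = np.exp(m - m_new)    # rescale what was accumulated so far
        P = np.exp(S - m_new[:, None])
        l = l * alpha + P.sum(axis=1)
        out = out * alpha[:, None] + P @ Vb
        m = m_new
    return out / l[:, None]

# Sanity check against naive attention.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=s) for s in [(32, 8), (48, 8), (48, 8)])
S = (Q @ K.T) / np.sqrt(8)
P = np.exp(S - S.max(axis=1, keepdims=True))
ref = (P / P.sum(axis=1, keepdims=True)) @ V
assert np.allclose(tiled_attention(Q, K, V), ref)
```

The rescaling by the running maximum makes the blockwise result exact (hence the assert); the savings come from never materializing the full attention matrix in slow memory.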


Would you like more information on this topic?


Source: Conversation with Bing, 7/2/2023

(1) [2205.14135] FlashAttention: Fast and Memory-Efficient Exact Attention .... https://arxiv.org/abs/2205.14135.

(2) arXiv:2205.14135v2 [cs.LG] 23 Jun 2022. https://arxiv.org/pdf/2205.14135.pdf.

(3) Introducing Lightning Flash — From Deep Learning Baseline To ... - Medium. https://medium.com/pytorch/introducing-lightning-flash-the-fastest-way-to-get-started-with-deep-learning-202f196b3b98.

(4) Attention in Neural Networks - 1. Introduction to attention mechanism. https://buomsoo-kim.github.io/attention/2020/01/01/Attention-mechanism-1.md/.