[TRANSLATE] translate AICS/CMU10-414/MLC/CS224n/CS285 and LHY (#528)

* complete eng_version for deep learning folder

* fix typo

* add english version for machine learning systems

* Update AICS.en.md

Adjust indentation
nzomi 2023-12-14 13:03:37 +08:00 committed by GitHub
parent 2b5f6a0f38
commit 540131ba71
6 changed files with 155 additions and 0 deletions


@ -0,0 +1,32 @@
# Intelligent Computing Systems
## Course Overview
- University: University of Chinese Academy of Sciences
- Prerequisites: Computer Architecture, Deep Learning
- Programming Languages: Python, C++, BCL
- Course Difficulty: 🌟🌟🌟
- Estimated Hours: 100+ hours
Intelligent computing systems are the hardware backbone of AI worldwide: billions of devices shipped each year, from smartphones to servers to wearables, depend on them. Training professionals in these systems is critical to the competitiveness of China's AI industry, and understanding them is a core skill for computer science students.
Prof. Yunji Chen's course, taught in various universities, uses experiments to provide a holistic view of the AI tech stack. Covering deep learning frameworks, coding in low-level languages, and hardware design, the course fosters a systematic approach.
Personally, completing experiments 2-5 deepened my understanding of deep learning framework internals. The BCL language experiment in chapter five will feel familiar to anyone who has written CUDA.
I recommend the textbook for a comprehensive view of the tech stack. Students already comfortable with deep learning can start from chapter five and dive straight into framework internals.
Inspired by the course, I developed a [simple deep learning framework](https://github.com/ysj1173886760/PyToy) and plan to write a tutorial for it. Written in Python, the codebase is small and well suited to students with some background. Future plans include adding more operators and possibly porting it to C++ to balance performance with development efficiency.
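To give a taste of what the experiments ask for, here is a minimal sketch of a naive 2-D convolution in plain Python. This is my own illustration of the idea, not the course's BCL or TensorFlow implementation, and it ignores channels, padding, and performance:

```python
def conv2d(x, w, stride=1):
    """Naive 2-D convolution.

    x: 2-D list of floats (H x W input)
    w: 2-D list of floats (kH x kW kernel)
    Returns the valid (un-padded) convolution output.
    """
    kh, kw = len(w), len(w[0])
    h_out = (len(x) - kh) // stride + 1
    w_out = (len(x[0]) - kw) // stride + 1
    out = [[0.0] * w_out for _ in range(h_out)]
    for i in range(h_out):
        for j in range(w_out):
            # Dot product of the kernel with the current input patch.
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    s += x[i * stride + di][j * stride + dj] * w[di][dj]
            out[i][j] = s
    return out
```

Experiment 2's real operator adds input/output channels, padding, and vectorization on top of exactly this triple loop.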
## Course Resources
- Course Website: [Official Website](https://novel.ict.ac.cn/aics/)
- Course Videos: [bilibili](https://space.bilibili.com/494117284)
- Course Textbook: "Intelligent Computing Systems" by Chen Yunji
- Course Assignments: 6 experiments (including writing a convolutional operator, adding operators to TensorFlow, writing operators in BCL and integrating them into TensorFlow, etc.; specific content can be found on the official website)
- Experiment Manual: [Experiment 2.0 Guide Manual](https://forum.cambricon.com/index.php?m=content&c=index&a=show&catid=155&id=708)
- Study Notes: <https://sanzo.top/categories/AI-Computing-Systems/>, notes summarized based on the experiment manual
## Resource Compilation
All resources and homework implementations used by @ysj1173886760 in this course are consolidated in [ysj1173886760/Learning: ai-system - GitHub](https://github.com/ysj1173886760/Learning/tree/master/ai-system)


@ -0,0 +1,28 @@
# CMU 10-414/714: Deep Learning Systems
## Course Overview
- University: Carnegie Mellon University (CMU)
- Prerequisites: Introduction to Systems (e.g., 15-213), basics of deep learning, fundamental mathematical knowledge
- Programming Languages: Python, C++
- Difficulty: 🌟🌟🌟
- Estimated Hours: 100 hours
The rise of deep learning owes much to user-friendly frameworks like PyTorch and TensorFlow. Yet, many users remain unfamiliar with these frameworks' internals. If you're curious or aspiring to delve into deep learning framework development, this course is an excellent starting point.
Covering the full spectrum of deep learning systems, the curriculum spans top-level framework design, autodifferentiation principles, hardware acceleration, and real-world deployment. The hands-on experience includes five assignments, building a deep learning library called Needle. Needle supports automatic differentiation, GPU acceleration, and various neural networks like CNNs, RNNs, LSTMs, and Transformers.
Even beginners are well served: the course starts from simple classifiers and backpropagation optimization, and detailed Jupyter notebooks walk through the more complex networks. For those with prior background, the assignments after automatic differentiation are still approachable and offer fresh insights.
Instructors [Zico Kolter](https://zicokolter.com/) and [Tianqi Chen](https://tqchen.com/) have open-sourced the course materials. The online autograder and forums are closed to outside learners, but the local tests bundled with the framework code still work. Hopefully a public online session will be offered again next fall.
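The heart of the course is reverse-mode automatic differentiation. As a flavor of the idea, here is a toy scalar version of my own; it is emphatically not Needle's actual API, which works on tensors and is built up over several assignments:

```python
class Value:
    """Toy scalar with reverse-mode autodiff (illustration only, not Needle)."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents  # pairs of (parent Value, local gradient)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data,
                     ((self, other.data), (other, self.data)))

    def backward(self):
        # Topologically sort the compute graph, then apply the chain rule
        # in reverse, accumulating gradients into each parent.
        topo, seen = [], set()

        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p, _ in v._parents:
                    visit(p)
                topo.append(v)

        visit(self)
        self.grad = 1.0
        for v in reversed(topo):
            for p, local in v._parents:
                p.grad += local * v.grad


x = Value(3.0)
y = Value(4.0)
z = x * y + x   # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
```

Needle does the same bookkeeping over tensor operations, with the local gradients supplied by each operator's `gradient` method.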
## Course Resources
- Course Website: <https://dlsyscourse.org>
- Course Videos: <https://www.youtube.com/watch?v=qbJqOFMyIwg>
- Course Assignments: <https://dlsyscourse.org/assignments/>
## Resource Compilation
All resources and assignment implementations used by @PKUFlyingPig in this course are consolidated in [PKUFlyingPig/CMU10-714 - GitHub](https://github.com/PKUFlyingPig/CMU10-714)


@ -0,0 +1,24 @@
# Machine Learning Compilation
## Course Overview
- University: Online course
- Prerequisites: Foundations in Machine Learning/Deep Learning
- Programming Language: Python
- Difficulty: 🌟🌟🌟
- Estimated Hours: 30 hours
This course, offered by top scholar Tianqi Chen during the summer of 2022, focuses on the field of machine learning compilation. The area remains cutting-edge and rapidly evolving, with few dedicated courses available anywhere. If you're interested in gaining a comprehensive overview of machine learning compilation, this course is worth exploring.
The curriculum centers on the popular machine learning compilation framework [Apache TVM](https://tvm.apache.org/), co-founded by Tianqi Chen. It covers how models built in frameworks such as TensorFlow, PyTorch, and JAX are transformed into deployable forms that run faster and adapt to different hardware. The course presents the concepts at a relatively high, macro level, and each session comes with a Jupyter Notebook that explains them in code. If you work on TVM-related development, these notebooks also offer rich, well-organized code examples for reference.
All course resources are open-source, with versions available in both Chinese and English. The course recordings can be found on both Bilibili and YouTube in both languages.
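One recurring idea in the course is graph-level transformation, such as operator fusion. Here is a framework-free sketch of the concept in plain Python (my own illustration with made-up functions, not TVM's API):

```python
def add(x, y):
    """Elementwise add: produces an intermediate buffer."""
    return [a + b for a, b in zip(x, y)]


def relu(x):
    """Elementwise ReLU: a second full pass over memory."""
    return [max(v, 0.0) for v in x]


def unfused(x, y):
    # Two kernels, one intermediate list allocated in between.
    return relu(add(x, y))


def fused_add_relu(x, y):
    # One pass, no intermediate buffer -- the kind of rewrite an ML
    # compiler performs automatically on the computation graph.
    return [max(a + b, 0.0) for a, b in zip(x, y)]
```

TVM expresses such rewrites as passes over an intermediate representation rather than hand-written Python, but the payoff is the same: fewer kernel launches and less memory traffic.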
## Course Resources
- Course Website: <https://mlc.ai/summer22-zh/>
- Course Videos: [Bilibili][Bilibili_link]
- Course Notes: <https://mlc.ai/zh/index.html>
- Course Assignments: <https://github.com/mlc-ai/notebooks/blob/main/assignment>
[Bilibili_link]: https://www.bilibili.com/video/BV15v4y1g7EU?spm_id_from=333.337.search-card.all.click&vd_source=a4d76d1247665a7e7bec15d15fd12349


@ -0,0 +1,27 @@
# CS224n: Natural Language Processing
## Course Overview
- University: Stanford
- Prerequisites: Foundations of deep learning + Python
- Programming Language: Python
- Course Difficulty: 🌟🌟🌟🌟
- Estimated Hours: 80 hours
CS224n is an introductory course in Natural Language Processing (NLP) offered by Stanford and led by renowned NLP expert Chris Manning, co-creator of the GloVe word-vector algorithm. The course covers core concepts in the field of NLP, including word embeddings, RNNs, LSTMs, Seq2Seq models, machine translation, attention mechanisms, Transformers, and more.
The course consists of 5 progressively challenging programming assignments covering word vectors, the word2vec algorithm, dependency parsing, machine translation, and fine-tuning a Transformer.
The final project involves training a Question Answering (QA) model on the well-known SQuAD dataset. Some students' final projects have even led to publications in top conferences.
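The course opens with the idea that a word's meaning can be captured by a dense vector, compared by cosine similarity. A minimal sketch with hand-made toy vectors (my own illustration; real embeddings like word2vec or GloVe are trained from corpora):

```python
import math


def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


# Toy 3-d "embeddings", hand-made for illustration only.
emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

# Semantically related words should score higher than unrelated ones:
# cosine(king, queen) > cosine(king, apple)
```

Assignments 1 and 2 build on exactly this picture, first exploring pre-trained vectors and then training word2vec yourself.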
## Course Resources
- Course Website<http://web.stanford.edu/class/cs224n/index.html>
- Course Videos: Search for 'CS224n' on Bilibili <https://www.bilibili.com/>
- Course Textbook: N/A
- Course Assignments: <http://web.stanford.edu/class/cs224n/index.html>, 5 programming assignments + 1 final project
## Resource Compilation
All resources and assignment implementations used by @PKUFlyingPig during the course are compiled in [PKUFlyingPig/CS224n - GitHub](https://github.com/PKUFlyingPig/CS224n)


@ -0,0 +1,22 @@
# CS285: Deep Reinforcement Learning
## Course Overview
- University: UC Berkeley
- Prerequisites: CS188, CS189
- Programming Language: Python
- Course Difficulty: 🌟🌟🌟🌟
- Estimated Hours: 80 hours
The CS285 course, currently taught by Professor Sergey Levine, covers various aspects of deep reinforcement learning. It is suitable for students with a foundational understanding of machine learning, including concepts such as Markov Decision Processes (MDPs). The course involves a substantial amount of mathematical formulas, so a reasonable mathematical background is recommended. Additionally, the professor regularly updates the course content and assignments to reflect the latest research developments, making it a dynamic learning experience.
For course content access, as of the Fall 2022 semester, the teaching format involves pre-recorded videos for students to watch before class. The live sessions mainly focus on Q&A, where the professor discusses selected topics from the videos and answers students' questions. Therefore, the provided course video links already include all the content. The assignments consist of five programming projects, each involving the implementation and comparison of classical models. Occasionally, assignments may also include the reproduction of recent models. The final submission typically includes a report. Given that assignments provide a framework and often involve code completion based on hints, the difficulty level is not excessively high.
In summary, this course is suitable for beginners entering the field of deep reinforcement learning. Although the difficulty increases as the course progresses, it offers a rewarding learning experience.
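Since the course assumes familiarity with Markov Decision Processes, here is a compact value-iteration sketch on a two-state MDP (my own refresher example, not a course assignment):

```python
def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Value iteration for a finite MDP.

    P[s][a]: list of (probability, next_state) transitions
    R[s][a]: immediate reward for taking action a in state s
    Returns the optimal state-value function V*.
    """
    n = len(P)
    V = [0.0] * n
    while True:
        # Bellman optimality backup for every state.
        newV = [
            max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                for a in range(len(P[s])))
            for s in range(n)
        ]
        if max(abs(a - b) for a, b in zip(V, newV)) < tol:
            return newV
        V = newV


# Two states, two actions: action 0 stays put, action 1 moves to the
# other state. Only staying in state 1 pays reward 1.
P = [[[(1.0, 0)], [(1.0, 1)]],
     [[(1.0, 1)], [(1.0, 0)]]]
R = [[0.0, 0.0],
     [1.0, 0.0]]
V = value_iteration(P, R)
# Optimal values: V[1] = 1 / (1 - 0.9) = 10, V[0] = 0.9 * V[1] = 9
```

The course's deep RL methods replace this exact tabular backup with function approximators, so being fluent in the tabular version pays off early.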
## Course Resources
- Course Website: <http://rail.eecs.berkeley.edu/deeprlcourse/>
- Course Videos: <https://www.youtube.com/playlist?list=PL_iWQOsE6TfX7MaC6C3HcdOf1g337dlC9>
- Course Textbook: N/A
- Course Assignments: <http://rail.eecs.berkeley.edu/deeprlcourse/>, 5 programming assignments


@ -0,0 +1,22 @@
# National Taiwan University: Machine Learning by Hung-yi Lee
## Course Overview
- University: National Taiwan University
- Prerequisites: Proficiency in Python
- Programming Language: Python
- Course Difficulty: 🌟🌟🌟🌟
- Estimated Hours: 80 hours
Professor Hung-yi Lee, a professor at National Taiwan University, is known for his humorous and engaging teaching style. He often incorporates fun elements like Pokémon into his slides, making the learning experience enjoyable.
Although labeled as a machine learning course, the breadth of topics covered is impressive. The course includes a total of 15 labs covering Regression, Classification, CNN, Self-Attention, Transformer, GAN, BERT, Anomaly Detection, Explainable AI, Attack, Adaptation, RL, Compression, Life-Long Learning, and Meta Learning. This wide coverage allows students to gain insights into various domains of deep learning, helping them choose areas for further in-depth study.
Don't be put off by the difficulty of the assignments. Each comes with sample code from the teaching assistants that walks through data processing, model building, and more; students only need to modify the provided code. This is an excellent opportunity to learn from high-quality code, and the sample code doubles as a useful starting point for course projects of your own.
## Course Resources
- Course Website: <https://speech.ee.ntu.edu.tw/~hylee/ml/2022-spring.php>
- Course Videos: <https://speech.ee.ntu.edu.tw/~hylee/ml/2022-spring.php>
- Course Textbook: N/A
- Course Assignments: <https://speech.ee.ntu.edu.tw/~hylee/ml/2022-spring.php>, 15 labs covering a wide range of deep learning domains