Fundamentals of Deep Learning for Multiple Data Types Training

  • Training Type: Classroom / Virtual Classroom / Online
  • Duration: 1 Day
  • You can schedule this training at your own institution. Contact us!

This workshop uses a series of hands-on exercises to teach deep learning techniques for a range of problems involving multiple data types. You will work with widely-used deep learning tools, frameworks, and workflows to perform neural network training on a fully-configured, GPU-accelerated workstation in the cloud. After a quick introduction to deep learning, you will advance to building deep learning applications for image segmentation, sentence generation, and image and video captioning — while simultaneously learning relevant computer vision, neural network, and natural language processing concepts.
At the end of the workshop, you will be able to assess a broad spectrum of problems where deep learning can be applied.

Prerequisites
  • Successful completion of the 'Fundamentals of Deep Learning for Computer Vision' DLI course, or equivalent.
  • Familiarity with basic Python (functions and variables) and prior experience training neural networks is expected.

At the conclusion of the workshop, you will have an understanding of the fundamentals of deep learning and be able to:
  • Implement common deep learning workflows such as image segmentation and text generation.
  • Compare and contrast data types, workflows, and frameworks.
  • Combine deep learning-powered computer vision and natural language processing to start solving sophisticated real-world problems that require multiple input data types.
Why Deep Learning Institute Hands-On Training?
  • Learn how to build deep learning and accelerated computing applications across a wide range of industry segments such as autonomous vehicles, digital content creation, finance, game development, and healthcare
  • Obtain guided hands-on experience using the most widely-used, industry-standard software, tools, and frameworks
  • Gain real-world expertise through content designed in collaboration with industry leaders including Children’s Hospital Los Angeles, Mayo Clinic, and PwC
  • Earn NVIDIA DLI Certification to demonstrate your subject matter competency and support professional career growth
  • Access content anywhere, anytime with a fully-configured, GPU-accelerated workstation in the cloud
Certification
Upon successful completion of the workshop, participants will receive NVIDIA DLI Certification to recognize subject matter competency and support professional career growth.

Introduction
  • Content overview
  • Get started with deep learning
Introduction to deep learning, situations in which it is useful, key terminology, industry trends, and challenges.
Image Segmentation with TensorFlow
  • Compare image segmentation to other computer vision problems
  • Experiment with TensorFlow tools
  • Implement effective metrics for assessing model performance
Hands-on exercise: Segment MRI images to measure parts of the heart using tools such as TensorBoard and the TensorFlow Python API.
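A widely used metric for assessing segmentation models like the one in this exercise is the Dice coefficient, which measures the overlap between a predicted mask and the ground truth. A minimal NumPy sketch (the toy masks below are illustrative, not the workshop's MRI data):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice overlap between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy 4x4 masks: the prediction overlaps the target in 2 of 3 pixels.
pred = np.zeros((4, 4), dtype=np.uint8)
target = np.zeros((4, 4), dtype=np.uint8)
pred[0, 0:3] = 1      # predicted mask covers 3 pixels
target[0, 1:4] = 1    # ground-truth mask covers 3 pixels
print(round(float(dice_coefficient(pred, target)), 3))  # → 0.667
```

A Dice score of 1.0 means perfect overlap and 0.0 means none; unlike raw pixel accuracy, it stays informative even when the region of interest is a tiny fraction of the image, which is typical of medical segmentation.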
Word Generation with TensorFlow
  • Introduction to natural language processing (NLP) and recurrent neural networks (RNNs)
  • Create network inputs from text data
  • Test with new data
  • Iterate to improve performance
Hands-on exercise: Train a recurrent neural network to understand both images and text, and to predict the next word of a sentence using the Microsoft Common Objects in Context (MSCOCO) dataset.
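Before an RNN can predict the next word, raw text must be turned into numeric network inputs. A minimal sketch of building (context, next-word) training pairs; the sentence and window size are illustrative assumptions, not the workshop's MSCOCO pipeline:

```python
def make_next_word_pairs(text, context_size=2):
    """Slide a fixed-size window over the tokens; each window of word IDs
    becomes a network input, and the following word ID is its target."""
    tokens = text.lower().split()
    vocab = {word: i for i, word in enumerate(sorted(set(tokens)))}
    ids = [vocab[t] for t in tokens]
    pairs = []
    for i in range(len(ids) - context_size):
        context = ids[i:i + context_size]   # RNN input
        target = ids[i + context_size]      # word the RNN should predict
        pairs.append((context, target))
    return pairs, vocab

pairs, vocab = make_next_word_pairs("a man rides a bike down a street")
print(len(pairs))  # → 6 training examples from an 8-word sentence
```

In a real pipeline the integer IDs would then be mapped to embedding vectors before entering the recurrent layers, but the windowing idea is the same.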
Image and Video Captioning
  • Combine computer vision and natural language processing to describe scenes
  • Learn to harness the functionality of convolutional neural networks (CNNs) and RNNs
Hands-on exercise: Train a model that generates a description of an image from raw pixel data by combining outputs of multiple networks (CNNs and RNNs) through concatenation and/or averaging.
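The merge step described above can be sketched in a few lines. Here, random NumPy vectors stand in for a CNN image embedding and an RNN text state; the two common merge strategies, concatenation and element-wise averaging, are assumptions matching the exercise description, not the exact workshop code:

```python
import numpy as np

rng = np.random.default_rng(0)
cnn_features = rng.standard_normal(256)   # stand-in for a CNN image embedding
rnn_state = rng.standard_normal(256)      # stand-in for an RNN hidden state

# Concatenation keeps both representations intact side by side.
merged_concat = np.concatenate([cnn_features, rnn_state])  # shape (512,)

# Averaging fuses them into one vector of the original size
# (this requires both networks to emit vectors of the same length).
merged_avg = (cnn_features + rnn_state) / 2.0              # shape (256,)

print(merged_concat.shape, merged_avg.shape)
```

The merged vector is what a downstream decoder would consume to emit caption words; concatenation preserves more information at the cost of a wider layer, while averaging keeps the dimensionality fixed.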
Summary
  • Summary of key learnings
  • Workshop survey
  • Review of concepts and practical takeaways
Tools, libraries, and frameworks: TensorFlow, TensorBoard.
