Practical iOS 11: What’s New in iOS 11, Swift 4 and Xcode 9

English | MP4 | AVC 1280×720 | AAC 48KHz 2ch | 4 Hours | 754 MB

Learn all the new APIs including ARKit (Augmented Reality), Machine Learning and the Vision Framework – App Development

iPhone Development Course

What’s New in iOS 11, Swift 4 and Xcode 9

In this course you are going to learn some of the new features added in iOS 11 and Xcode 9. We are going to create multiple apps, each focused on one of these new features. Some of the features covered include:

New Xcode Features

Xcode 9 has lots of new features, and we’ll be taking a look at some of the most exciting ones and how to take advantage of them to become a better coder. These features include structure highlighting, method extraction, easy renaming, GitHub integration, and more.

ARKit

Augmented reality apps used to take thousands of lines of code and complex algorithms to build, but with the new ARKit framework in iOS 11 it takes only a few lines of code to make an AR app.
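
To give a sense of how little code is involved, here is a minimal sketch (assuming an ARSCNView outlet named sceneView is wired up in the storyboard and the camera usage description is set in Info.plist) that starts a world-tracking AR session:

import UIKit
import ARKit

class ARViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking follows the device's position and orientation in the room.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view is no longer on screen.
        sceneView.session.pause()
    }
}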

Machine Learning

Machine learning is gaining traction across the computer industry, and there’s no better time to learn it and how to develop an app that takes advantage of it. We’ll develop a simple app that can recognise the dominant object in a photo, and another that can detect the faces of the people in a photo.
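
As a rough sketch of what the image recognition app boils down to (assuming a model such as Inceptionv3, downloaded from Apple’s machine learning page, has been added to the project so Xcode generates its Swift class), the classification is a single Vision request:

import UIKit
import CoreML
import Vision

// Classify the dominant object in a UIImage and print the best guess.
func classify(_ image: UIImage) {
    guard let ciImage = CIImage(image: image),
          let model = try? VNCoreMLModel(for: Inceptionv3().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Results come back sorted by confidence; the first observation is the best guess.
        guard let best = request.results?.first as? VNClassificationObservation else { return }
        print("\(best.identifier): \(Int(best.confidence * 100))% confidence")
    }

    let handler = VNImageRequestHandler(ciImage: ciImage)
    try? handler.perform([request])
}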

Drag and Drop

iOS 11 allows users to drag and drop content within one app and also between two different apps. We’ll develop a basic app that lets you drag text from one part of the screen and drop it into a table view.
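
A minimal sketch of the drag side, assuming the view controller keeps its strings in an items array and assigns itself as the table view’s dragDelegate:

import UIKit

extension ViewController: UITableViewDragDelegate {

    func tableView(_ tableView: UITableView,
                   itemsForBeginning session: UIDragSession,
                   at indexPath: IndexPath) -> [UIDragItem] {
        // Wrap the row's text in an NSItemProvider so it can be dropped elsewhere,
        // including into another app on iPad.
        let provider = NSItemProvider(object: items[indexPath.row] as NSString)
        return [UIDragItem(itemProvider: provider)]
    }
}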

Vision

We’ll develop an app that recognises and tracks real-world objects live, directly through your phone’s camera, by combining the Vision framework with a Core ML model.
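
The heart of the live-tracking app is the per-frame camera callback. A sketch, assuming an AVCaptureSession is already running with this object as the video output’s sample buffer delegate and visionRequest is a prepared VNCoreMLRequest:

import AVFoundation
import Vision

extension CameraViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Run the Vision request on the current camera frame.
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
        try? handler.perform([visionRequest])
    }
}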

Many More

We’ll also be diving into Natural Language Processing, SiriKit, PDFKit, Apple Music integration, new camera APIs, the new Files app and so on…
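
For example, detecting the dominant language of a string with the NSLinguisticTagger API added in iOS 11 takes only a few lines (a quick sketch with a sample French sentence):

import Foundation

let tagger = NSLinguisticTagger(tagSchemes: [.language], options: 0)
tagger.string = "Bonjour tout le monde"
if let language = tagger.dominantLanguage {
    print("Dominant language: \(language)")   // prints "fr"
}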

Now is the best time to learn iOS 11, Swift 4 and Xcode 9, before the official public release reaches users.

Table of Contents

Introduction
1 Introduction to the course
2 What’s new in iOS 11

What’s New in iOS 11 and Xcode 9
3 Introduction
4 Introduction to the project file
5 Add large titles to the navigation bar
6 Disable large titles on a view controller
7 Increase or decrease font size
8 Run multiple simulators at the same time
9 Simulators now have hardware buttons
10 Refactoring – Renamed variables propagate throughout the project
11 Extract methods from a chunk of code
12 Structure highlighting
13 Resize simulator window freely
14 Wireless building
15 Ability to add named colors
16 One-click protocol stubs fix

Machine Learning – CoreML – Image Recognition App
17 Introduction
18 Setting up the image detection app user interface
19 Setting up the button click event
20 Download the model from the Apple website
21 Create the model and the request
22 Create the handler and perform the request
23 Request results and result confidence

Machine Learning and Vision – Face Detection App
24 Introduction
25 Design the user interface of the face detection app
26 Set up the UIImageView programmatically
27 Create the next image button click event
28 Create the detect faces function to get the bounding box
29 Create the bounding box for each of the faces
30 More photo samples to test the app

Augmented Reality ARKit App – Placing 3D objects in your environment
31 Introduction
32 Create the user interface for the app
33 Configure the session and set up the camera usage description
34 Add Apple 3D models into the app
35 Create a Model class
36 Testing the model class and positioning models
37 Create the function to add model and to pass different model names
38 Create the touchesBegan function to get the touch position
39 Pick random models and place around the environment

Vision – Real Time Object Tracking Through The Camera
40 Introduction
41 Design the user interface of the object tracking app
42 Create the camera as the input device
43 Create the camera output, set up camera permissions and run the camera
44 Adding the delegate method for capturing image frames
45 Vision request using the Inception CoreML model
46 Displaying the result and testing the app to track real-world objects

Drag and Drop Feature – Drag Texts To A UITableView
47 Introduction
48 Design the user interface of the drag and drop app
49 Set up the table view correctly
50 Configure the items to drag delegate function
51 Implement a drag preview for lifting delegate function
52 Complete the table view drop delegate implementation and test the app

Natural Language Processing App
53 Introduction
54 Designing the UI of the language processing app
55 Detect the dominant language of a text
56 Break words into lexical classes
57 Tokenization – breaking sentences into words
58 Lemmatization – find the root of any word
59 NameType – detect organization, personal and place names in a text

QR Code Reader App
60 Introduction
61 Introduction to the starter project
62 Create the function to perform the image request
63 Create QR Codes and test the app
64 Getting the barcode string value and navigating to the website