
WWDC 2017 videos







This video, available on the official Apple website (session 215), points out the main iOS 11 accessibility features. The various contents and their video timestamps are indicated hereunder:

Text detection in image (07:07) ⟹ iOS 11 new feature
Improved photo description (08:01) ⟹ iOS 11 new feature
Type to Siri (11:37) ⟹ iOS 11 new feature
Attributed Accessibility Properties (26:07) ⟹ iOS 11 new feature
Accessibility Container Type (27:20) ⟹ iOS 11 new feature

During this presentation, the following solutions for accessibility development pitfalls are also suggested thanks to a simple application (take a look at it):

How to fill the label and value properties? (29:59)
How to define an (in)accessible element? (31:10)
How to gather several elements into a single one? (31:50)

Hereafter, selecting a title will start the video playback directly at the proper moment.

Text detection in image (07:07) #

It's now possible to find out whether some text is embedded inside an image. This basic detection, triggered by a simple tap with 3 fingers, will vocalize the text to someone who can't read it.

Improved photo description (08:01) #

The vocalization of a photo description is a new VoiceOver feature that is fired by a simple tap with 3 fingers. A very simple detection of the context, the faces and their expressions is then vocalized to a visually impaired user.

Type to Siri (11:37) #

A very useful feature for people who can't use Siri vocally or who wish to make some requests in a discreet way. To enable this feature, go to Accessibility in the Settings to make the activation effective.

In this part, the Xcode Accessibility Inspector instrument is also used to show the basics of an accessibility audit of an app. Examples are provided without explaining the tool itself in depth; a former Apple session (2016 - 407) is dedicated to it. A reminder of the UIAccessibility informal protocol fundamentals used during the presentation follows.

Attributed Accessibility Properties (26:07) #

NSAttributedString can be used to customize the way the label, value and hint accessibility properties are vocalized. Among the provided examples, one deals with the vocalization of a specific element in a foreign language. All usable keys can be found in the official Apple documentation.

Accessibility Container Type (27:20) #

A new accessibility container type is available in iOS 11. The notion of container already existed in iOS, but until now VoiceOver couldn't know which kind of container it was dealing with.

Many different actions may be attached to a single element thanks to a set of accessibilityCustomAction objects. As soon as this element is selected, a vertical swipe with one finger proposes the choice of possible actions.

In order to limit, or make easier, all the user's VoiceOver handling, it's possible to trigger appropriate actions as soon as an element is activated with a double tap. The programming implementation of this feature is detailed in the development part.

When you want to change the value provided by elements such as a slider or a picker in a very fluid way, two methods have to be implemented: as soon as the view is selected, a vertical swipe with one finger increases (up) or decreases (down) the value.

If a VoiceOver user double taps with one finger, holds it and then pans, a finer adjustment of a slider can be obtained. The accessibilityActivationPoint attribute allows a precise location to be used for this gesture.

The basic VoiceOver scrolling is based on a 3-finger swipe, but it can be customized thanks to the accessibilityScroll method belonging to the UIAccessibilityAction informal protocol.

The new drag and drop feature isn't explained in depth; only its 2 overriding principles are presented: the drag sources and the drop points. Their use is briefly described in an example at the end of the video.

Learn how to quickly build interactive prototypes! See how you can test new ideas and improve upon existing ones with minimal time investment, using tools you are already familiar with.
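A minimal sketch of the foreign-language vocalization described in the attributed-properties part, assuming a hypothetical `greetingLabel` (the `.accessibilitySpeechLanguage` attribute key is the documented iOS 11 API):

```swift
import UIKit

// greetingLabel is illustrative; the key below asks VoiceOver to speak
// the string with a French voice (iOS 11).
let greetingLabel = UILabel()
greetingLabel.text = "bienvenue"
greetingLabel.accessibilityAttributedLabel = NSAttributedString(
    string: "bienvenue",
    attributes: [.accessibilitySpeechLanguage: "fr-FR"]
)
```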
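Declaring the new iOS 11 container type is a one-liner; `ingredientsView` and the `.list` choice are illustrative:

```swift
import UIKit

// Telling VoiceOver what kind of container this custom view is,
// so it can be announced as a list (iOS 11).
let ingredientsView = UIView()
ingredientsView.accessibilityContainerType = .list
```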
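The custom-action mechanism from the development part can be sketched as follows; the cell, the action names and their handlers are hypothetical:

```swift
import UIKit

// Once the cell is selected, a one-finger vertical swipe cycles through
// these actions and a double tap runs the chosen one.
final class MessageCell: UITableViewCell {
    func configureAccessibility() {
        isAccessibilityElement = true
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Reply",
                                        target: self,
                                        selector: #selector(reply)),
            UIAccessibilityCustomAction(name: "Delete",
                                        target: self,
                                        selector: #selector(deleteMessage))
        ]
    }

    @objc private func reply() -> Bool { return true }          // placeholder handler
    @objc private func deleteMessage() -> Bool { return true }  // placeholder handler
}
```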
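Triggering an appropriate action on a double tap maps to overriding `accessibilityActivate()` from the UIAccessibility informal protocol; this sketch assumes a hypothetical `PlayButton`:

```swift
import UIKit

// Bespoke logic when a VoiceOver user double-taps the element.
final class PlayButton: UIView {
    var isPlaying = false

    override func accessibilityActivate() -> Bool {
        isPlaying.toggle()   // placeholder behaviour
        return true          // true tells VoiceOver the activation succeeded
    }
}
```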
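The two methods to implement for fluid slider/picker adjustments are presumably the documented `accessibilityIncrement()`/`accessibilityDecrement()` pair, enabled by the adjustable trait; `VolumeControl` and its range are hypothetical:

```swift
import UIKit

// With the .adjustable trait set, a one-finger swipe up/down on the
// selected view calls the two overridden methods below.
final class VolumeControl: UIControl {
    private var volume = 5 {
        didSet { accessibilityValue = "\(volume)" }
    }

    override var accessibilityTraits: UIAccessibilityTraits {
        get { return .adjustable }
        set { super.accessibilityTraits = newValue }
    }

    override func accessibilityIncrement() { volume = min(10, volume + 1) }
    override func accessibilityDecrement() { volume = max(0, volume - 1) }
}
```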
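One way to use `accessibilityActivationPoint`, sketched with a hypothetical settings cell whose activation should land on its embedded switch (the attribute expects screen coordinates):

```swift
import UIKit

// Redirecting the element's activation to the switch's on-screen centre.
final class SettingCell: UITableViewCell {
    let toggle = UISwitch()

    override var accessibilityActivationPoint: CGPoint {
        get {
            let screenFrame = UIAccessibility.convertToScreenCoordinates(toggle.bounds,
                                                                         in: toggle)
            return CGPoint(x: screenFrame.midX, y: screenFrame.midY)
        }
        set { super.accessibilityActivationPoint = newValue }
    }
}
```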
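Customizing the 3-finger scroll means overriding `accessibilityScroll(_:)`; the paging logic in this `GalleryView` sketch is illustrative:

```swift
import UIKit

// Handling the VoiceOver 3-finger swipe ourselves.
final class GalleryView: UIView {
    private var pageIndex = 0

    override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
        switch direction {
        case .left:
            pageIndex += 1
        case .right:
            pageIndex = max(0, pageIndex - 1)
        default:
            return false  // unhandled: VoiceOver keeps its default behaviour
        }
        // Announce the new position after a handled scroll.
        UIAccessibility.post(notification: .pageScrolled,
                             argument: "Page \(pageIndex + 1)")
        return true
    }
}
```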
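The two drag and drop principles, drag sources and drop points, are exposed to VoiceOver through `UIAccessibilityLocationDescriptor` (iOS 11); the attachment and trash views here are hypothetical:

```swift
import UIKit

// Each descriptor names a point VoiceOver can offer for dragging/dropping.
let attachmentView = UIImageView()
attachmentView.accessibilityDragSourceDescriptors = [
    UIAccessibilityLocationDescriptor(name: "Drag attachment", view: attachmentView)
]

let trashView = UIView()
trashView.accessibilityDropPointDescriptors = [
    UIAccessibilityLocationDescriptor(name: "Drop on trash", view: trashView)
]
```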





