I've played games my whole life and was shocked to get near-instant VR motion sickness in sim racing. Can confirm it can be ground through: recognize the feelings and stop immediately.
I'm not sure why, but I feel like I only get motion sickness in the back of Priuses. It must be something about their braking curve. I don't sit in enough EVs to tell if they're the same.
I went on a cruise, and had significant (for me) motion sickness that only got better once I ate; of course, I was avoiding eating because I didn't feel well, so that seems like the wrong choice.
The image description stuff is already surprisingly good. I noticed it when I got a photo in a text while driving, and it described the image well enough for me to know what it was.
The same is true with reading out text messages. I’ve disabled it for CarPlay now after receiving a mildly raunchy text with a car full of colleagues. It’s still useful on the headphones though.
This is a good time to remind everyone that tomorrow, May 16th, is Global Accessibility Awareness Day (GAAD) (https://accessibility.day), and that there are over 176 events worldwide going on to celebrate the progress we are all making at improving accessibility in our products, with plenty of learning opportunities for beginners and experts.
I wonder if it'll use the same architecture as visionOS, where the eye-tracking events and UI affordances are processed and composited out of process, with the app never seeing them.
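
For reference, that is roughly how visionOS handles gaze today: an app opts into system-drawn hover feedback with SwiftUI's `hoverEffect` modifier, and the highlight is composited by the system without the app ever receiving gaze coordinates. A minimal sketch (the view name is made up for illustration):

```swift
import SwiftUI

// Hypothetical view for illustration. On visionOS the system highlights
// the button when the user looks at it; the app never sees gaze data,
// only the eventual selection (pinch/tap).
struct PlaybackControls: View {
    var body: some View {
        Button("Play") {
            print("Play selected")   // first point where the app learns anything
        }
        .hoverEffect(.highlight)     // hover feedback drawn out-of-process
    }
}
```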
Well, they aren't really limited to accessibility, but they are hidden there. It's sort of like a convenient excuse to get UI designers off your back if you want to ship customization.
Tim Cook in response to a shareholder proposing scrapping accessibility to improve ROI: “When we work on making our devices accessible by the blind, I don’t consider the bloody ROI”.
That's just more "cause marketing." The first ever $3T company definitely considers the bloody ROI in every decision -- whether the CEO is willing to admit it publicly or not.
> People can and often do the right thing -- large companies rarely do.

There are companies which try (and succeed) to be honest "citizens". Berkshire and Costco have quite good reputations.
I read this very skeptically. When I hear "eye tracking" I immediately think of advertisers targeting me based on what I look at, NOT quadriplegics using Apple devices. Maybe I'm a cynic.
I made a Shortcut that drops the white point, turns on the red filter, zeroes the audio, and turns down the brightness. On wake-up it reverses these. It's attached to an automation for the Sleep and Wake Focus modes, and it works really well. I added an explicit run requirement for wake so that I could sleep in without getting blasted in the face with white: there's a notification with a run button which I can wait to hit until I'm truly up. https://www.icloud.com/shortcuts/4bcfd8fc02074316aaae3503d07...
Thanks for the reminder. Wish I had remembered this feature during the aurora photography I was doing. I set the phone brightness to the lowest setting but the red filter would have helped even more.
I think you can just add it to Control Centre, no need for shortcuts.
I made an app for reading in the dark, minimising the amount of light hitting your eyeballs, but I'm still using the red color filter every night. The app is overall darker and more "strict" with how it handles content (esp. images) though: https://untested.sonnet.io/Heart+of+Dorkness and midnight.sonnet.io Overall, reducing the number of photons is a fun metric to play with when building something/messing with prototypes.
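
If you want to play with that metric in a prototype, a crude SwiftUI starting point is a dark colour scheme plus a red colour-multiply over the content, so mostly long-wavelength light is emitted. This is only a sketch of the general idea, not the linked app's implementation:

```swift
import SwiftUI

// Rough sketch: dark base UI, then multiply everything toward red so the
// screen emits as few (and as long-wavelength) photons as possible.
struct NightReadingView: View {
    let text: String

    var body: some View {
        ScrollView {
            Text(text)
                .padding()
        }
        .preferredColorScheme(.dark)  // dark base to minimise emitted light
        .colorMultiply(.red)          // drop the green/blue channels of the UI
    }
}
```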
Kinda feels like you could’ve done more of a cursory glance to see what functionality was actually being talked about before going for the “Android already does this!!” comment.
> Too many simple straightforward options are now strictly inside accessibility.
Yes, they can, but I don't, and I still hate needless animations and turn them off. The point is, why is it in "accessibility" when it should be more visible?
It would only be a violation if it's purely software-locked. If it requires a chip that supports specific operations, and entry-tier devices have an older chip, that wouldn't be a violation.
The system one is, but advertisers could always roll their own and see if they can get away with "you can only view this content if you give us permission to use your camera".
Three-finger dragger for life here. Whenever I see colleagues struggling to drag items on a Mac, I show them three-finger dragging and it blows them away. Total game-changer!
It's better to have an easy way to hold the mouse button via the keyboard with your left hand and keep using one finger on the touchpad than to do the whole three-finger drag.
If it's accurate enough, eye tracking could be useful beyond accessibility, e.g. to control iPads without dragging one's greasy fingers all over the screen. The iPad already supports pointers.
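
For context, the pointer plumbing the comment refers to already exists on iPadOS: a view adds a `UIPointerInteraction` and supplies a hover style, and whatever drives the pointer (trackpad, mouse, or in principle a gaze-driven cursor) gets the same behaviour. A rough sketch; the class name is invented for illustration:

```swift
import UIKit

// Illustrative only: a view that adopts iPadOS pointer interactions and
// lifts slightly when the pointer hovers over it.
final class HoverCardView: UIView, UIPointerInteractionDelegate {

    override init(frame: CGRect) {
        super.init(frame: frame)
        addInteraction(UIPointerInteraction(delegate: self))
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        addInteraction(UIPointerInteraction(delegate: self))
    }

    // Called by the system to ask how the pointer should look over this view.
    func pointerInteraction(_ interaction: UIPointerInteraction,
                            styleFor region: UIPointerRegion) -> UIPointerStyle? {
        UIPointerStyle(effect: .lift(UITargetedPreview(view: self)))
    }
}
```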
macOS has had a version of eye tracking for a while; it's really fun to try out: System Preferences -> Accessibility -> Pointer Control, then turn on the "Head pointer" option.
That's a separate option. The option above head tracking in those settings allows for adding your facial expressions (smiling, blinking etc) as shortcuts for clicks.
Accessibility is for everyone, including you, if you live long enough. And the alternative is worse. So your choice is death or you are going to use accessibility features. – Siracusa
Yep. And I think there's an interesting implicit bias where younger / more junior developers often get tasked with things like that, so they see no problem.
I had a Deaf friend who had dual cochlear implants about 10 years ago. I was blown away to learn they had Bluetooth at the time and she could beam music directly into her brain.
There are a lot of interesting things we can do with haptics, since they're relatively cheap to put in stuff. Hopefully accessibility work pushes the software and applications further along soon.
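
On the software side the entry point is already pretty small; a minimal Core Haptics sketch that plays a single sharp tap looks roughly like this (the intensity and sharpness values are arbitrary placeholders):

```swift
import CoreHaptics

// Minimal Core Haptics sketch: play one sharp transient tap.
func playTap() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One transient event at t = 0 with placeholder intensity/sharpness.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```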
At least I used to put the phone away while I was in the car. Not the case now. Thank you, Apple, but I'd rather be sick while looking at your phone in a car.
Some cars already have a voice control button on the wheel for their existing system which, if done correctly, is overridden by Siri+CarPlay. Which is really nice when it works.
It actually improved by no longer being a real button; instead it was a fake indent with a pressure sensor that did a haptic "click". But it still took up a lot of space on the front.
I wonder if this announcement had anything to do with the bombshells OpenAI and Google dropped this week. Couldn’t this have been part of WWDC next month?
Very curious to see how well eye tracking works behind a motorcycle visor. Thick leather gloves, bike noise, and a touch screen and audio interface are not much fun.
It took me a couple read-throughs to understand what you meant, but yes, on-screen gaze-sensitive magnification would be amazing, I agree.
Eye-gaze devices (a tablet with a camera plus software) cost around $20K; even if this offers 1/4 of the features, it's good news for those who can't afford one.
Quibble, but this isn't "eye tracking", it's "gaze tracking". Eye tracking is detecting where your eyes are; gaze tracking is detecting what you're looking at.
This is how I feel about "face recognition" (it should mean recognizing whether something is a face or not), but it is common to use "eye tracking" this way.
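
ARKit's face tracking actually exposes both quantities separately, which makes the distinction concrete: the eye transforms describe where the eyes are and how they're oriented, while `lookAtPoint` estimates what is being looked at. A quick sketch, assuming an `ARFaceAnchor` delivered by a face-tracking session:

```swift
import ARKit

// Sketch: read both "eye tracking" and "gaze tracking" style data
// from a single ARFaceAnchor.
func inspect(_ anchor: ARFaceAnchor) {
    // Eye tracking: pose of each eyeball relative to the face anchor.
    let leftEyePose  = anchor.leftEyeTransform
    let rightEyePose = anchor.rightEyeTransform

    // Gaze tracking: estimated point the eyes converge on,
    // in the face anchor's coordinate space.
    let gazeTarget = anchor.lookAtPoint

    print("left eye at:", leftEyePose.columns.3,
          "right eye at:", rightEyePose.columns.3,
          "looking at:", gazeTarget)
}
```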
This excites me so, so much! I can't really use my phone as a passenger in a car without getting motion sick after 1-2 minutes. This seems like it might be a promising thing to try.