News
News from the Digital Communication, Web & Web Gis 2.0 World
12 Dec 2024
iOS 18.2: Take a Hearing Test With AirPods Pro 2 - MacRumors
Apple on December 11 released iOS 18.2, which expands the Hearing Test feature on the AirPods Pro 2 to nine additional countries: Cyprus, Czechia, France, Italy, Luxembourg, Romania, Spain, the United Arab Emirates, and the United Kingdom. The feature first launched in the U.S. and select other countries in late October with iOS 18.1.
Note that Apple's Hearing Test feature is not available in all regions due to differing regulatory laws. Apple maintains a list on its website of regions and territories where the test is available. If your location isn't on the list, you can still take the test – see the last section of this article for details.
What You Need
- AirPods Pro 2 updated with the latest firmware
- iPhone or iPad running iOS/iPadOS 18.1 or later
- A quiet environment
- About 5 minutes of uninterrupted time
The Hearing Test requires you to listen for a comprehensive range of tones at different, and sometimes very low, volumes. For this reason, it's important to take the test in a quiet environment for its full duration, free from intermittent noise, people talking, or loud air conditioning or fans nearby.
Taking the Hearing Test
Make sure your AirPods Pro 2 are sufficiently charged before taking the test.
- Put your AirPods Pro 2 in your ears.
- Open Settings on your iPhone.
- Tap your AirPods Pro name at the top of Settings.
- Under "Hearing Health," tap Take a Hearing Test.
- Answer the preliminary questions about your age and recent loud noise exposure.
- Follow the fit test to ensure your ear tips create a proper seal.
- When the test begins, tap the screen each time you hear a tone.
- Complete the test for both ears (the test will automatically switch sides).
Understanding Your Results
The test measures your hearing in dBHL (decibel hearing level) and provides an easy-to-read classification:
- Up to 25 dBHL: Little to no hearing loss - can hear normal conversation easily
- 26-40 dBHL: Mild hearing loss - can hear normal speech at close range
- 41-60 dBHL: Moderate hearing loss - requires raised voices to understand speech
- 61-80 dBHL: Severe hearing loss - can only hear very loud speech or shouting
You can access your test results anytime in the Health app. Tap Browse ➝ Hearing, then tap Hearing Test Results to view your history. To share your results with healthcare providers, tap Export PDF at the bottom, or tap the Share button to email or save the audiogram.
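Hearing Test results are stored in the Health app as audiogram data, which third-party apps can read through HealthKit with your permission. For anyone curious about the developer side, here is a minimal Swift sketch, assuming an app with the HealthKit capability and read access granted; it relies only on the standard HKAudiogramSample type, not anything specific to Apple's Hearing Test.

```swift
import HealthKit

// Minimal sketch: read the most recent audiogram (hearing test result) from HealthKit.
// Assumes the HealthKit capability is enabled and NSHealthShareUsageDescription is set.
func fetchLatestAudiogram() {
    let healthStore = HKHealthStore()
    let audiogramType = HKObjectType.audiogramSampleType()

    healthStore.requestAuthorization(toShare: [], read: [audiogramType]) { authorized, _ in
        guard authorized else { return }

        let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
        let query = HKSampleQuery(sampleType: audiogramType,
                                  predicate: nil,
                                  limit: 1,
                                  sortDescriptors: [newestFirst]) { _, samples, _ in
            guard let audiogram = samples?.first as? HKAudiogramSample else { return }

            // Each sensitivity point is a test frequency with per-ear hearing levels in dBHL.
            for point in audiogram.sensitivityPoints {
                let hz = point.frequency.doubleValue(for: .hertz())
                let left = point.leftEarSensitivity?.doubleValue(for: .decibelHearingLevel())
                let right = point.rightEarSensitivity?.doubleValue(for: .decibelHearingLevel())
                print("\(hz) Hz: left \(left ?? .nan) dBHL, right \(right ?? .nan) dBHL")
            }
        }
        healthStore.execute(query)
    }
}
```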
After the Test
The results of your test can be used to unlock additional AirPods Pro 2 features and options including Media Assist and Hearing Aid mode.
Media Assist
If mild to moderate hearing loss is detected, you can enable Media Assist to optimize audio for your hearing profile:
- Go to Settings ➝ your AirPods Pro.
- Scroll down to Hearing Health.
- Tap Media Assist.
- Toggle Media Assist on.
- Choose Use Hearing Test Results to apply your recent test data, or Custom Setup to manually adjust settings.
Hearing Aid Mode
Your AirPods Pro 2 can function as basic hearing aids, amplifying conversations and environmental sounds. You can enable this feature in the Settings app under Accessibility ➝ Hearing Devices.
Bear in mind that the Hearing Aid feature isn't available everywhere. Apple maintains a list of regions and territories in which the feature can be accessed.
Taking the Hearing Test in Unsupported Regions
As we mentioned at the top of this article, Apple's Hearing Test feature isn't available in all regions and territories at the time of writing this, but that doesn't mean you can't take the test.
Instead, you can use the following deep link to directly access the test in the Health app: x-apple-health://HearingAppPlugin.healthplugin/HearingTest – tap the link or paste it into Safari on your iPhone or iPad and you will be taken straight to the test (thanks to Reddit user Special_Lake240). Alternatively, download this Apple Hearing Test shortcut, grab your AirPods Pro 2, then run the shortcut on your device to take the test.
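If you'd rather trigger the link from your own code than paste it into Safari (for example, from a small test app), the Swift sketch below simply asks iOS to open the URL scheme quoted above. This is not a documented Apple API; whether the link resolves depends on the installed iOS version and the Health app.

```swift
import UIKit

// Sketch: open the Hearing Test deep link from an app's own UI action.
// The scheme is the one quoted above; it is undocumented, so treat it as
// best-effort rather than a supported API.
func openHearingTest() {
    guard let url = URL(string: "x-apple-health://HearingAppPlugin.healthplugin/HearingTest") else { return }
    UIApplication.shared.open(url, options: [:]) { success in
        print(success ? "Handed off to the Health app" : "The link could not be opened")
    }
}
```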
After taking the test using this method, your results will be available in the Health app. Just bear in mind that you won't be able to access Apple's other hearing health features like Media Assist and Hearing Aid mode unless they are officially available in your region.
This article, "iOS 18.2: Take a Hearing Test With AirPods Pro 2" first appeared on MacRumors.com
macOS 15.2 brings confirmation of the M4 MacBook Air - TheAppleLounge
Everything You Need to Know About Apple Intelligence - MacRumors
Apple Intelligence Features Available Now
Writing Tools
- Proofread checks your text for spelling and grammar errors, including word choice and sentence structure. You can accept all suggestions with a tap or go through them one by one with explanations.
- Rewrite cleans up what you've written and shifts the tone without impacting your content. Options include Friendly, Professional, and Concise.
- You can select text and get a summary of it with Apple Intelligence. You can choose to create a paragraph, pull out key points, make a list, or create a table. Summaries are available in Mail, Messages, and more.
- As of iOS 18.2, there is an open-ended Writing Tools option that lets you describe a change you want to make to something you've written. You can choose any mood or writing style that you want, with varying degrees of success.
- Also in iOS 18.2, Writing Tools has a "Compose" feature that uses Siri ChatGPT integration. With this option, Siri can leverage ChatGPT to compose writing from scratch rather than just rewriting text.
You can select any text on your iPhone, iPad, or Mac and use Apple Intelligence to access Writing Tools for summaries and other features.
Siri
- There's a new glow around the edges of the display when Siri is activated, applicable to iPhone, iPad, and CarPlay. On Mac, the Siri window can be placed anywhere. The glow animates responsively to the sound of your voice so you can tell when Siri is listening without interrupting other things you're doing.
- A double tap at the bottom of the display brings up the Type to Siri interface so you can type requests instead of speaking them. On Mac, you need to press the Command key twice to bring up Type to Siri. Type to Siri includes suggested requests so you can get your questions answered faster.
- Siri can maintain context between requests so you can ask a question and then reference it in a second reply. If you ask about the temperature in Raleigh, for example, and then follow up with "what's the humidity?" Siri should know you mean in Raleigh.
- If you stumble over your words when speaking to Siri, or change what you're saying mid-sentence, Siri will follow along.
- Siri draws on Apple's product knowledge and support base to answer questions about your device's features and settings, and can even find settings when you don't know their exact names, thanks to natural language search.
Mail
- There is a summarize button for summarizing any of your incoming emails, plus you will see a brief summary of an email in your inbox list rather than the first few lines of the email.
- Mail surfaces time sensitive messages first when applicable, putting them at the top of your inbox so you see what's important right away.
- Smart Reply provides quick-tap responses to emails that you've been sent, with contextual options based on what's in the email.
- Multiple notifications from Mail will be summarized on your Lock Screen so you can see what's in an email without opening the app.
Messages
- Messages has Smart Reply options for incoming texts, which analyze the content of messages to offer suggestions of what you might want to say.
- Multiple Messages notifications are summarized on your Lock Screen.
- You can use all of the Writing Tools features in the Messages app for proofreading and refining what you're planning to send.
Photos
- You can create a Memory Movie with just a description, such as "My cat in 2024," or "Orlando in the summer." The feature automatically picks relevant photos and chooses songs, but you can tweak through the Memory Mixes feature or choose a mood to guide the direction of the audio. You can also add in specific scenes and images you want to see throughout the memory when you're creating the prompt.
- Natural language search is available in Photos, so you can just describe what you're looking for, such as "Eric rollerskating while wearing green."
- Search can also find specific moments in video clips.
- Search offers up smart complete suggestions for narrowing down what you might want to find.
Clean Up
The Photos app also includes "Clean Up," a feature that lets you remove unwanted objects from your photos. The Clean Up tool in the Photos app is able to automatically detect objects in an image that might not be wanted, but you can also tap, circle, or brush over an unwanted object to remove it.
Zooming in on an image makes it easier to use a finger as a brush to remove smaller blemishes, and Clean Up is intelligent enough not to remove part of a person even if a person or main subject is selected.
Clean Up works on all images in the Photos library, including older images and images captured by other devices like a point-and-shoot camera or a DSLR.
Transcription Summaries
In Notes and other apps, you can record audio and get a transcript along with a summary of your transcript, which is useful for recording lectures and other audio. Transcription isn't an Apple Intelligence feature, but summaries are.
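Transcription itself predates Apple Intelligence: third-party apps can do the same thing with the long-standing Speech framework. Below is a minimal Swift sketch, assuming a local audio file and the usual speech-recognition permission; it produces a transcript only, with no summary.

```swift
import Foundation
import Speech

// Sketch: transcribe a recorded audio file on-device with the Speech framework.
// Assumes NSSpeechRecognitionUsageDescription is set in Info.plist and the user
// grants permission; `fileURL` is any local recording, such as a lecture.
func transcribe(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        request.requiresOnDeviceRecognition = true // keep processing local where supported

        recognizer.recognitionTask(with: request) { result, error in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error {
                print("Transcription failed: \(error.localizedDescription)")
            }
        }
    }
}
```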
Focus Modes
There is a dedicated Reduce Interruptions Focus Mode that only shows you important notifications that need attention while filtering out everything else.
When customizing an existing Focus mode or creating a new one, there is a new toggle to turn on Intelligent Breakthrough and Silencing. This option allows important notifications to interrupt you while muting notifications that are not important, the same behavior as the Reduce Interruptions Focus. You can still override the notification settings for specific apps and people, which is how it worked prior to iOS 18.1.
Notification Summaries
Your incoming notifications are summarized so you can see what's new for each app at a glance.
Phone
Summaries of transcriptions generated from your phone calls are supported in iOS 18.1. The iOS 18.1 update adds the option to record a phone call and get a transcription, which is not an Apple Intelligence feature. What does require Apple Intelligence, though, is getting a summary from that transcription without reading through the entire thing.
Note that you can start a recording by tapping on the record button in the upper left corner of the display when on a phone call. All participants are notified that the call is being recorded before the recording starts.
Recorded phone calls are stored in the Notes app, where you can tap in to view a transcript and get a summary generated from that transcript.
Safari
When reading an article in Reader Mode, there is an option to have Apple Intelligence summarize the article for you.
Apple Intelligence Features Coming in iOS 18.2
The next set of Apple Intelligence features will come in iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2, and this is when Apple plans to release Image Playground, Genmoji, and ChatGPT Siri integration. These features are being beta tested right now.
Apple has a waitlist for Image Playground, Genmoji, and Image Wand, and these features will be rolling out to developers over the coming weeks so that Apple can collect feedback and make improvements to the image generation capabilities before a public launch later this year.
Image Playground
Image Playground can be used to generate images in cartoon-like styles using a text-based prompt. Apple has built-in suggestions and concepts that you can choose from, like costumes, locations, items, and themes.
While Image Playground is a standalone app, it is also integrated into the Messages and Notes apps. In these apps, Apple can use context from what you've typed for image generation suggestions.
For inspiration, you can upload or take a photo, and then get a cartoon AI version of it, and you can also use images of friends and family members pulled from the People album in the Photos app.
You can start with a base suggestion or photo and then continue to add to it until you get what you want. You can remove suggestions at any time, and save your favorite creations for use in other apps. Anytime you create something with Image Playground, you'll get multiple options so you can choose the best one.
There are only Animation and Illustration styles for Image Playground, so there is no option for creating photorealistic images.
Image Wand
Image Wand is basically the same thing as Image Playground, but in the Notes app. When you have notes that you've taken, you can circle an empty spot or some text and Image Wand will add a contextually relevant image.
So if you have notes on photosynthesis, you can add in an image of a plant under the sun. Image Wand isn't able to generate complex images, so if you want a picture of the internal structure of mitochondria, you're out of luck, but it can make a stylized image featuring the organelle.
On an iPad, you can draw a rough sketch of what you want to add to your notes with an Apple Pencil, and then use Image Wand to generate something more polished.
Genmoji
Genmoji are custom emoji characters that you can create using a text-based description. If there's an emoji you can't find but need, like a duck eating a sandwich or an alligator skateboarding, Genmoji can make it for you.
Genmoji aren't too far off from Image Playground images, but the generation system tends to want to add a person for a lot of requests. You can choose yourself or a friend or family member, or just use a generic emoji character as your base.
In Messages and other apps, Genmoji behave like emoji, but they won't display properly for anyone running an operating system earlier than iOS 18.1 or for anyone on an Android device. On iOS 18.1 and iOS 18.2 they work like regular emoji; on earlier versions of iOS they show up as a blank box with an accompanying full-size image.
If you have iOS 18.1 or iOS 18.2 and someone sends a Genmoji, you can long press on it and tap the "Emoji Details" button to see what prompt was used to create it, plus you can add it to your own Genmoji/sticker collection to use.
To create a Genmoji character, tap into the Emoji keyboard and tap on the Emoji with a "+" button next to the search bar. From there, you can type in your idea.
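On the developer side, Apple lets third-party apps accept Genmoji in rich text views as "adaptive image glyphs." The Swift sketch below reflects the iOS 18 SDK as presented at WWDC 2024; treat the supportsAdaptiveImageGlyph property name as an assumption to verify against current documentation before relying on it.

```swift
import UIKit

// Sketch: let a text view accept Genmoji typed from the emoji keyboard.
// Genmoji arrive as adaptive image glyphs inside the attributed text, so the
// view needs rich-text editing plus the iOS 18 opt-in flag shown below.
final class NoteViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        textView.allowsEditingTextAttributes = true    // store attributed (rich) text
        if #available(iOS 18.0, *) {
            textView.supportsAdaptiveImageGlyph = true // accept Genmoji as inline glyphs
        }
        view.addSubview(textView)
    }
}
```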
Siri ChatGPT Integration
Siri ChatGPT integration lets Siri hand requests over to OpenAI's ChatGPT. ChatGPT is off by default, but you can turn it on in the Apple Intelligence and Siri section of the Settings app.
If ChatGPT integration is enabled, Siri will consult ChatGPT for complex requests. Complex requests might include creating an image, generating text from scratch, making recipe ideas based on what's in your refrigerator, describing what's in a photo, and more.
Siri will analyze each request to see if it's something that needs to be answered by ChatGPT, but you can also invoke ChatGPT directly with a request like "Ask ChatGPT to give me a chocolate chip cookie recipe."
Siri will ask your permission before querying ChatGPT, but there is an option to turn off that extra permission step by toggling off "Confirm ChatGPT Requests" in the ChatGPT section of Settings.
You don't need an account to use ChatGPT, and it is free, but if you have a paid account, you can sign in. If you're not signed in, OpenAI does not store any of your ChatGPT requests, nor is your information used for training ChatGPT. If you sign in, ChatGPT can save a copy of your queries. Apple does not store ChatGPT queries.
ChatGPT can be used with Siri, but it is also integrated into Writing Tools and Visual Intelligence. With Writing Tools, ChatGPT can generate text, and with Visual Intelligence, ChatGPT can answer questions about what the Camera sees.
Visual Intelligence
Visual Intelligence is an iPhone 16 feature that uses the Camera Control button. If you long press it, you can get into Visual Intelligence mode, where the Camera app can be used to identify what's around you.
If you point the camera at a store, for example, you can see ratings, hours, and other information. If you take a photo of an object, you can get more information about the object from ChatGPT, or use it with Google Search to find similar images. The Google Search feature is a good way to search for products that you want to find.
Other Visual Intelligence features include reading text out loud, detecting phone numbers and addresses to add them to the Contacts app, copying text, and summarizing long passages of text.
Writing Tools
Writing Tools is in iOS 18.1, but in iOS 18.2, you can make more open-ended changes to what you've written. You can come up with your own tone changes, so if you want something to be more flowery or elaborate, Writing Tools can make it happen. You can also ask for your text to be converted into a different format, like a poem.
Writing Tools also has ChatGPT integration in iOS 18.2 so if you want to generate text from scratch, you can do so with ChatGPT.
Additional Languages
In iOS 18.2, Apple Intelligence supports localized English in Australia, Canada, New Zealand, South Africa, Ireland, and the UK in addition to U.S. English.
Apple Intelligence Features Coming Later
There are additional Apple Intelligence features that will be coming in updates to iOS 18 in 2025.
Priority Notifications
Priority notifications will show up at the top of your notification stack, so you can get to what's most important first.
Siri
Some initial Siri updates are available in iOS 18.1, such as Siri's new glow that encompasses the edges of the display, and ChatGPT integration is coming in iOS 18.2, but we'll need to wait for iOS 18.3 and iOS 18.4 for additional Siri capabilities. Apple is working on onscreen awareness, personal context, and the ability to take more actions in and across apps.
Onscreen awareness will let Siri take actions when you ask something about what's on your display. If you're looking at a photo and want to message it to your friend Eric, you'll be able to tell Siri to "Send this picture to Eric," and Siri will understand and do it.
Personal context will let Siri do more with your personal data like emails and messages. This is an on-device feature, and it will let Siri learn more about you, who you're communicating with, and how you use your device. Personal context will let you do things like ask Siri to find a specific message, or remind you when you took a photo that you're looking for.
The Siri option to take more actions in and across apps will drastically improve what Siri is capable of. You'll be able to move files from one app to another and control app functions with Siri that you never could before. It'll work in third-party apps as well as Apple's own apps.
macOS Features
Memory Movie creation is in iOS 18.1 and iPadOS 18.1, but not macOS Sequoia 15.1. Genmoji is in iOS 18.2 and iPadOS 18.2, but not macOS Sequoia 15.2.
Apple plans to bring Memory Movie creation and Genmoji to macOS Sequoia at a later time.
When to Expect More Apple Intelligence Features
More Apple Intelligence features will come in iOS 18.2, iOS 18.3, and iOS 18.4.
We'll get iOS 18.2 in December, likely around the middle of the month. Apple is already testing iOS 18.2 with developers.
In late January or so, we'll get iOS 18.3, which could potentially have some new Siri features.
iOS 18.4, which isn't expected until around March 2025, will have the bulk of the Siri Apple Intelligence features. We're also expecting to see Apple roll out support for additional languages in 2025.
Apple Intelligence Device Requirements
Apple Intelligence requires a device with one of Apple's newest chips and at least 8GB of RAM. Eligible models are listed below.
- All iPhone 16 models
- iPhone 15 Pro and iPhone 15 Pro Max
- All Apple silicon iPads
- A17 Pro iPad mini
- All Apple silicon Macs
Apple Intelligence Waitlist
When you first install iOS 18.1, iPadOS 18.1, or macOS Sequoia 15.1, you have to go to the Settings app and join the Apple Intelligence waitlist. Apple uses a waitlist mechanism to ensure that behind-the-scenes downloads go smoothly and that the system isn't overloaded.
While on the waitlist, Apple devices download necessary files for on-device processing, and the waitlist should only take a few hours at most. The waitlist is on a per-account basis, so you only need to sign up for it on one device to have access on multiple devices.
Image Playground, Genmoji, and Image Wand in iOS 18.2 require a second waitlist, but that's only relevant if you're running the iOS 18.2 developer beta. If you are, note that Apple is rolling out access to these features over the coming weeks.
Apple Intelligence Settings
In the Privacy and Security section of the Settings app, you can access an Apple Intelligence Report that lets you export your Apple Intelligence data, part of Apple's transparency commitment for Apple Intelligence. Biometric authentication is required to access and export this data.
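Apple hasn't said how it gates the report internally, but requiring Face ID or Touch ID before exposing sensitive data is a standard pattern any app can reproduce with the LocalAuthentication framework. A purely illustrative Swift sketch:

```swift
import Foundation
import LocalAuthentication

// Sketch: require Face ID / Touch ID (falling back to the device passcode)
// before running a sensitive action such as exporting data. This is the
// generic LocalAuthentication pattern, not Apple's own implementation.
func exportSensitiveData(_ export: @escaping () -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) else {
        print("Authentication unavailable: \(error?.localizedDescription ?? "unknown error")")
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthentication,
                           localizedReason: "Authenticate to export your data") { success, _ in
        DispatchQueue.main.async {
            if success { export() }
        }
    }
}
```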
You can also disable Apple Intelligence by toggling off the setting under the Apple Intelligence and Siri section in the Settings app.
Apple Intelligence Availability
Apple Intelligence is only available in U.S. English at this time, and it is not available in the European Union (on iPhone and iPad) or in China. Device region and language need to be set to the United States.
In iOS 18.2, localized English in Australia, Canada, New Zealand, South Africa, and the UK is supported in addition to U.S. English.
Apple plans to add support for additional languages in 2025, including Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese. Apple Intelligence will launch on the iPhone and the iPad in the European Union in April 2025.
This article, "Everything You Need to Know About Apple Intelligence" first appeared on MacRumors.com
Apple Shares Ad Highlighting Genmoji in iOS 18.2 - MacRumors
The spot features a custom song called "Anything You Like" by The Dare, and it runs through a list of rhyming Genmoji creations like gnome, foam, pink comb, and skeleton made out of chrome, showing each Genmoji as it's mentioned in the song.
Genmoji can be created on an Apple Intelligence-capable iPhone or iPad running the latest version of iOS or iPadOS. Genmoji are crafted in the Messages app from text-based descriptions, and can be shared with friends and family in text messages.
This article, "Apple Shares Ad Highlighting Genmoji in iOS 18.2" first appeared on MacRumors.com
11 Dec 2024
iOS 18.2 Features: Everything New in iOS 18.2 - MacRumors
This guide highlights everything that's new in iOS 18.2.
Apple Intelligence
There are several new Apple Intelligence features in iOS 18.2, including Image Playground and Genmoji. It's a more fun update than iOS 18.1 thanks to the image generation features.
- Image Playground - There's an Image Playground app and Messages integration for creating stylized images based on prompts and images of you and your friends.
- Image Wand - Image Wand is for adding images to what you've written in the Notes app. On iPad, you can use the Apple Pencil to make a rough sketch and have Image Wand turn it into something nicer.
- Genmoji - With Genmoji, you can create custom emoji. It's similar to Image Playground, but on a smaller, emoji-sized scale. People who have iOS 18.1 installed will see Genmoji and can even get information about them, but they can only be created in iOS 18.2. In earlier versions of iOS and on Android devices, they show up as an image.
- Siri ChatGPT - Siri can now hand over requests to ChatGPT, a feature that's entirely opt-in. You don't need an account, and anything you ask Siri can be sent to ChatGPT if you want. ChatGPT has more advanced info than Siri and can also generate images and text. If you want to make it simpler to turn requests over to ChatGPT, you can toggle off the Ask Every Time setting so Siri doesn't always need your permission to contact ChatGPT. Siri with ChatGPT integration supports on-screen responses, so you can do things like ask "what's in this photo?" when you're looking at an image.
- Visual Intelligence - On iPhone 16, Visual Intelligence can be used to identify objects and places around you. Long press the Camera Control button to get to Visual Intelligence, and then point your iPhone at something. You can get hours and reviews if the camera is looking at a restaurant, copy text, get text read aloud, search for items on Google, ask ChatGPT questions about objects, and more.
- Writing Tools - The Rewrite Writing Tools feature is no longer limited to three styles. You can describe the tone or content that you want, such as adding more dynamic words or turning an email into a poem.
- Languages - With iOS 18.2, Apple Intelligence supports English in Australia, Canada, New Zealand, South Africa, and the UK in addition to U.S. English, so if you're in Canada, you no longer have to set your iPhone to U.S. English to use Apple Intelligence.
We have a much more in-depth Apple Intelligence guide that goes over all of the available features, which is well worth checking out. We also have individual guides on the new additions in iOS 18.2.
- Everything You Need to Know About Apple Intelligence
- All of the Image Playground Features
- Genmoji in iOS 18.2: Everything You Need to Know
- Siri's Apple Intelligence Features
- Apple Intelligence Image Wand: All the New Features in iOS 18.2
Camera Control
For iPhone 16 users, there's now an option to lock your auto exposure and auto focus settings with a light press. The AE/AF Lock toggle can be turned on by going to Settings > Camera > Camera Control.
There is also an option to adjust the double-click speed of Camera Control, with Default, Slow, and Slower settings. These join the prior options for tweaking the double light-press speed and the light-press force.
As mentioned above, Camera Control now supports Visual Intelligence.
The update also adds a new setting for the Camera Control button. In the Settings app under Display & Brightness, there is now a toggle for "Require Screen On." When this setting is turned on, you can only launch the Camera app or a supported third-party camera app with the Camera Control button when the iPhone's screen is turned on.
Mail App Categorization
In the Mail app, there are new built-in categories for separating out important emails, deals, newsletters, transaction emails, and more. Apple Intelligence is required for Mail app categorization.
The Mail app includes bigger pictures for contacts and businesses, and all emails from a single person or source will be pooled together.
You can swap between categories (the new option) and List View (the standard, non-categorized view) by tapping the three dots in the upper right corner of the app. If you're using List View, you may need to go to Settings > Mail > Notifications > Customize Notifications and toggle on All Unread Messages, because the other setting only notifies you about unread messages categorized as "Primary," which can be confusing.
Tweaked Video Player and More in Photos App
Apple added a tweaked video player that takes up more of the screen in iOS 18.2. This change eliminates the thick borders around videos that appear on earlier iOS 18 versions, meaning you no longer have to tap on a video for full-screen playback.
It's now possible to scrub through a video frame-by-frame, plus there is a setting to turn off auto-looping video playback in the Photos section of the Settings app.
When using the Collections views in the Photos app, you can swipe right to go back to the previous view, plus the Favorites album now appears in the Utilities collection along with the Pinned Collections section.
You're also now able to clear Recently Viewed and Recently Shared album history.
Safari
In the Safari section of the Settings app, there's a new "Not Secure Connection Warning" toggle that lets you know if you're visiting a website that does not have a valid SSL certificate for an encrypted connection. It is not advised to send passwords or other sensitive data on a site that pops up a "Not Secure" warning.
Apple's "Not Secure Connection Warning" toggle is not turned on by default. Apple says that HTTPS Priority will upgrade URLs to HTTPS whenever possible.
The Settings app has a new section for managing website data and history, with options for Clear History and Website Data. The Website Data section includes options for exporting data from Safari and importing data from another app into Safari.
Apple added new background images for customizing the Safari start page.
Safari Downloads
The progress of Safari downloads can now be tracked on the iPhone's Lock Screen and Dynamic Island.
Voice Memos
With Voice Memos, two tracks can be layered on top of one another. There's also an option to separate layers and edit layer mixes.
Two-track projects can be imported directly into Logic Pro on the iPhone 16 Pro and iPhone 16 Pro Max.
Find My
Find My has a new option to Share Item Location with an "airline or trusted person" that can help you locate something that you've misplaced. Choosing the Share Item Location option creates a link that lets someone view the location of a lost item when they open the link.
The link can be opened on a non-Apple device, so an iPhone or Mac is not required to provide someone with your item's location. Links expire after a week or when you're reunited with your lost item.
There's also an option to Show Contact Info, which lets any phone or tablet connect to an item to view a website with more information about it, including the phone number and email address of the owner.
Apple Music
Apple Music now supports natural language search. You can search for genres, moods, activity, decades, and more. Examples include "songs about cats," "songs with a vibe," "relaxing songs," "artists similar to Taylor Swift," "sad 80s songs," and "songs about food."
Apple Music now displays the disc number in some albums.
Apple TV
In the Apple TV app, you can also use casual language search terms to find exactly what you're looking for, typing in genres, actors, and moods. Searches like "movies about natural disasters," "movies with cats," "movies with Zendaya," and "exhilarating movies" all bring up relevant results.
The navigation bar in the TV app can be customized, with options to add different apps and Library categories. Existing options like MLS and Apple TV+ cannot be removed.
Podcasts
You can favorite categories in the Podcasts app in iOS 18.2, and see categories in the library for a new way to navigate through shows and episodes.
A new personalized search page in Podcasts suggests the most relevant categories and editorially curated collections tailored to your listening preferences.
Apple News+
For Apple News+ subscribers in the United States, iOS 18.2 adds daily sudoku puzzles. The sudoku puzzles join the existing Crossword, Crossword Mini, and Quartiles offerings. There are three difficulty levels to complete each day: easy, moderate, and challenging.
Stocks
In the Stocks app, there are now pre-market price quotes for tracking NASDAQ and NYSE tickers before the market opens.
AirPods Pro Hearing Test and Hearing Aid Expansion
The AirPods Pro 2 Hearing Test feature is now available in Cyprus, Czechia, France, Italy, Luxembourg, Romania, Spain, United Arab Emirates, and the UK.
The Hearing Aid feature is available in the United Arab Emirates.
Apple Arcade
In the Apple Arcade section of the App Store, the "All Games" section has a new drop down filter menu and the option to turn off game previews.
EU App Changes
In the European Union, users can now delete core apps, including the App Store, Safari, Messages, Camera, and Photos.
Third-party browser apps in the EU will be able to create web apps for the iPhone's Home screen using their own custom engines when iOS 18.2 launches.
iPadOS 18.2 requires selecting a Default Browser when opening Safari. This is a Digital Markets Act requirement, with Apple adding an updated interface that will let users select a default web browser of their choice from a list of options.
Apple already implemented this change in iOS 18.
Default Apps Section
In the Settings app under "Apps," there's a new "Default Apps" section that can be used to manage your default apps for the iPhone.
There are Default App settings for Email, Messaging, Calling, Call Filtering, Browser App, Passwords and Codes, and Keyboards. In the U.S. and other countries, you can use this section to choose your preferred Email, Call Filtering, Browser, Passwords, and Keyboard apps. In the EU, there are additional options for choosing non-Apple calling and messaging apps.
Volume Limit
In the Sound and Haptics section of the Settings app, there's a new Volume Limit option with a "Limit Maximum Volume" toggle to control how loudly the iPhone speaker can play audio like songs, movies, and other media.
It does not impact phone calls, FaceTime calls, alarms, or other sounds.
Control Center
In Control Center, there's a new quick access option for Type to Siri. Apple has also removed the Satellite control from the Connectivity section, and changed the icon for Adaptive Audio.
Adjusting Camera Control can be done by opening Settings and going to Accessibility > Camera Button.
Settings App Icons
In Dark Mode, the icons in the Settings app have a new, darker look that shows color on a black background rather than icons that are a solid color with white accents.
iPhone Mirroring With Hotspot
iOS 18.2 allows you to use iPhone Mirroring while your iPhone's hotspot connection is being shared with your Mac. Having your Mac connected through Personal Hotspot previously did not allow you to use iPhone Mirroring.
Fitness Shortcut Actions
There are new Fitness app Shortcut actions that you can use when creating a Shortcut, including Open Fitness Settings, Open Award, Open Session History, and Open Trophy Case.
Lock Screen Volume Slider
In iOS 18.2, there's a new option to force the volume control bar to always be visible on the Lock Screen when adjusting sound.
The feature can be enabled in the Settings app under Accessibility > Audio and Visual > Always Show Volume Control.
Vehicle Motion Cues
There's an option to see Vehicle Motion Cues in the Dynamic Island when the feature is activated. Vehicle Motion Cues are designed to cut down on motion sickness while riding in a vehicle.
Mac Connection
When connecting to a Mac or PC, you can use Face ID to trust a device.
iMessage Reporting
Children in Australia have access to a tool to report iMessages that contain nude photos and videos. Reported images will be reviewed by Apple, and actions could be taken such as disabling the sender's Apple Account or reporting the incident to law enforcement.
The feature will expand globally in the future.
Music Recognition
In iOS 18.2, the Music Recognition control in Control Center has a Musical Memories feature that shows you where you were when you identified a song. You will need to long press on the Music Recognition toggle in Control Center, tap on History, and then allow location access to use the feature.
AirDrop
The AirDrop icon in the share sheet now appears dark in Dark Mode, rather than staying white.
Web Restrictions in Utah
In iOS 18.2, Utah residents under the age of 17 will be opted in to web content restrictions that block adult content, as required by Utah law.
Bug Fixes
There are fixes for a couple of notable bugs in iOS 18.2. The update addresses an issue that could cause captured photos to not immediately appear in the All Photos grid.
It also fixes a bug that could cause Night mode photos to appear degraded when capturing long exposures, a problem that impacted the iPhone 16 Pro and iPhone 16 Pro Max.
Security Updates
iOS 18.2 addresses multiple security vulnerabilities, which means it's a good idea to update as soon as you can.
A full list of the vulnerabilities fixed can be found on Apple's website, but it includes updates for the kernel, Passwords, Safari, WebKit, VoiceOver, and more.
Read More
For more detail on the Apple Intelligence features in iOS 18.2, we have a dedicated guide. Our iOS 18 roundup has a list of all of the features in iOS 18 if you want a recap.
This article, "iOS 18.2 Features: Everything New in iOS 18.2" first appeared on MacRumors.com
Discuss this article in our forums
iOS 18.2 Brings Layered Voice Memo Recordings to iPhone 16 Pro - MacRumors
Apple teamed up with Canadian singer-songwriter Michael Bublé, country star Carly Pearce, and record producer Greg Wells to demonstrate the feature. The trio recorded Michael Bublé's new song "Maybe This Christmas," with the vocals captured using the Voice Memos app on an iPhone 16 Pro.
"I don't think people realize the critical role Voice Memos on iPhone plays in the creation process for musicians," said Bublé. "And now with Layered Recordings, if an artist has a moment of inspiration, being unencumbered by the traditional studio experience becomes the advantage, not the limitation. It's so typically Apple to build something we didn't know we needed -- and now won't be able to live without."
After installing iOS 18.2, iPhone 16 Pro and Pro Max users can layer a vocal track on top of an existing instrumental recording, with no headphones needed. Instrumental compositions can be played through the iPhone's speaker while vocals are recorded at the same time using the iPhone 16 Pro microphones.
Apple says that this feature is powered by the A18 Pro chip, using advanced processing and machine learning to isolate the vocal recording. Voice Memos is able to create two individual tracks so users can apply additional mixing and production in apps like Logic Pro.
A variety of background instrumentals like acoustic guitar or piano can be used as the first layer for a recording, and using Logic Pro, artists and producers can send an instrumental music mix as a compressed audio file directly to Voice Memos for layering vocals on top.
Michael Bublé's "Maybe This Christmas" song can be streamed on Apple Music in Spatial Audio.
This article, "iOS 18.2 Brings Layered Voice Memo Recordings to iPhone 16 Pro" first appeared on MacRumors.com
Discuss this article in our forums
NASA is finishing its first off-world accident report - Popular Science
NASA is about to publish the world’s first off-world aircraft accident investigation. Aside from making history, the report will help the agency plan ahead for the next generation of flying vehicles that will help humanity explore Mars.
NASA engineers only intended the Mars Perseverance Rover’s Ingenuity helicopter to complete a maximum of five experimental test flights over 30 days in 2021. The experimental vehicle, however, proved much more durable than expected. Over nearly three more years, Ingenuity ultimately flew a total of 72 times, racking up more than two hours of aerial travel and traveling 30 times farther than planned.
[Related: RIP Mars Ingenuity, the ‘little helicopter that could’]
The rotorcraft’s flying career ended on January 18, 2024, however, when a botched landing appeared to fatally damage its blades. But what caused Ingenuity to miscalculate its 72nd flight remained a mystery to NASA. Since then, a collaborative research team from the Jet Propulsion Laboratory (JPL) and AeroVironment has spent months analyzing the available evidence and data.
“When running an accident investigation from 100 million miles away, you don’t have any black boxes or eyewitnesses,” Ingenuity’s first pilot, Håvard Grip, said in the JPL’s December 10th report announcement.
Grip explained that, while there are now multiple possible scenarios given the data, the team believes one explanation is the likeliest for Ingenuity’s landing failure: The aircraft navigation system couldn’t properly calculate its flight trajectory from the sparse information provided by its camera while traveling over relatively smooth Martian ground.
This graphic depicts the most likely scenario for the hard landing of NASA’s Ingenuity Mars Helicopter during its 72nd and final flight on Jan. 18, 2024. High horizontal velocities at touchdown resulted in a hard impact on a sand ripple, which caused Ingenuity to pitch and roll, damaging its rotor blades. Credit: NASA/JPL-Caltech
As JPL explains, the reviewed data indicates the helicopter’s navigation system began to lack enough trackable surface features roughly 20 seconds after takeoff. Subsequent photographic analysis suggests the resulting navigation errors left Ingenuity with a high horizontal velocity at touchdown that exceeded the design limits of its rotor blades. The impact caused all four blades to snap at their weakest points, and the resulting vibrations tore the remainder of one blade from the copter. The helicopter then rolled across the sand, and an excessive onboard power demand downed Ingenuity’s communications array for roughly six days.
In a sense, Ingenuity’s demise is also a testament to its resiliency. NASA never expected the helicopter to travel as far as that particular area in Jezero Crater on January 18th. Instead of traveling over rocky terrain covered in plenty of visual navigation cues as originally designed, Ingenuity was forced to navigate a region of steep and comparatively featureless sand ripples.
Despite all this, Ingenuity isn’t totally dead. Since engineers helped reestablish a link from Earth, the downed helicopter’s computer regularly transmits avionics and weather data to the Perseverance rover—information that may help human astronauts one day reach Mars.
In the meantime, NASA is using all this knowledge to plan for future Mars aerial vehicles, some of which may be as much as 20 times heavier than Ingenuity. On December 11th, team members previewed the Mars Chopper rotorcraft project, which would be capable of transporting several pounds of equipment while also autonomously exploring as much as 2 miles (roughly 10,560 feet) of Mars per day. For comparison, Ingenuity traveled about 2,310 feet on its longest flight.
But scaling up isn’t always necessary. Aside from the Mars Chopper, engineers are also working on designs for craft that are smaller and lighter than Ingenuity’s four-pound, 19-inch-tall frame. According to Ingenuity project manager Teddy Tzanetos, the aircraft’s lifespan and accomplishments show just how tough tiny packages can be on Mars.
“We’re now approaching four years of continuous operations, suggesting that not everything needs to be bigger, heavier, and radiation-hardened to work in the harsh Martian environment,” he said.
The post NASA is finishing its first off-world accident report appeared first on Popular Science.
These Apple Intelligence Features Aren't Coming Until 2025 - MacRumors
Image Playground
Image Playground can be used to generate images in cartoon-like styles using a text-based prompt. Apple has built-in suggestions and concepts that you can choose from, like costumes, locations, items, and themes.
While Image Playground is a standalone app, it is also integrated into the Messages and Notes apps. In these apps, Apple can use context from what you've typed for image generation suggestions.
For inspiration, you can upload or take a photo, and then get a cartoon AI version of it, and you can also use images of friends and family members pulled from the People album in the Photos app.
You can start with a base suggestion or photo and then continue to add to it until you get what you want. You can remove suggestions at any time, and save your favorite creations for use in other apps. Anytime you create something with Image Playground, you'll get multiple options so you can choose the best one.
There are only Animation and Illustration styles for Image Playground, so there is no option for creating photorealistic images.
Image Wand
Image Wand is basically the same thing as Image Playground, but in the Notes app. When you have notes that you've taken, you can circle an empty spot or some text and Image Wand will add a contextually relevant image.
So if you have notes on photosynthesis, you can add in an image of a plant under the sun. Image Wand isn't able to generate complex images, so if you want a picture of the internal structure of mitochondria, you're out of luck, but it can make a stylized image featuring the organelle.
On an iPad, you can draw a rough sketch of what you want to add to your notes with an Apple Pencil, and then use Image Wand to generate something more polished. Image Wand works on both iPhone and iPad.
Genmoji
Genmoji are custom emoji characters that you can create using a text-based description. If there's an emoji you can't find but need, like a duck eating a sandwich or an alligator skateboarding, Genmoji can make it for you.
Genmoji aren't too far off from Image Playground images, but the generation system tends to want to add a person for a lot of requests. You can choose yourself or a friend or family member, or just use a generic emoji character as your base.
In Messages and other apps, Genmoji behave like emoji in iOS 18.1 and iOS 18.2, but they won't display properly for anyone running an operating system earlier than iOS 18.1, or on an Android device. On earlier versions of iOS, Genmoji are displayed as a blank box with an accompanying full-size image.
If you have iOS 18.1 or iOS 18.2 and someone sends a Genmoji, you can long press on it and tap the "Emoji Details" button to see what prompt was used to create it, plus you can add it to your own Genmoji/sticker collection to use.
To create a Genmoji character, tap into the Emoji keyboard and tap on the Emoji with a "+" button next to the search bar. From there, you can type in your idea.
Writing Tools
- Proofread text checks for spelling and grammar errors, including word choice and sentence structure. You can accept all suggestions with a tap or go through them one by one with explanations.
- Rewrite cleans up what you've written and shifts the tone without impacting your content. Options include Friendly, Professional, and Concise.
- You can select text and get a summary of it with Apple Intelligence. You can choose to create a paragraph, pull out key points, make a list, or create a table. Summaries are available in Mail, Messages, and more.
- As of iOS 18.2, there is an open-ended Writing Tools option that lets you describe a change you want to make to something you've written. You can choose any mood or writing style that you want, with varying degrees of success.
- Also in iOS 18.2, Writing Tools has a "Compose" feature that uses Siri ChatGPT integration. With this option, Siri can leverage ChatGPT to compose writing from scratch rather than just rewriting text.
You can select any text on your iPhone, iPad, or Mac and use Apple Intelligence to access Writing Tools for summaries and other features.
Siri
- There's a new glow around the edges of the display when Siri is activated, applicable to iPhone, iPad, and CarPlay. On Mac, the Siri window can be placed anywhere. The glow animates responsively to the sound of your voice so you can tell when Siri is listening without interrupting other things you're doing.
- A double tap at the bottom of the display brings up the Type to Siri interface so you can type requests instead of speaking them. On Mac, you need to press the Command key twice to bring up Type to Siri. Type to Siri includes suggested requests so you can get your questions answered faster.
- Siri can maintain context between requests so you can ask a question and then reference it in a second reply. If you ask about the temperature in Raleigh, for example, and then follow up with "what's the humidity?" Siri should know you mean in Raleigh.
- If you stumble over your words when speaking to Siri, or change what you're saying mid-sentence, Siri will follow along.
- Siri has Apple's product knowledge and support base for answering questions about your device's features and settings, and can even find settings when you don't know the exact name by using natural language search.
Siri ChatGPT Integration
Siri ChatGPT integration lets Siri hand requests over to OpenAI's ChatGPT in iOS 18.2. ChatGPT is off by default, but you can turn it on in the Apple Intelligence and Siri section of the Settings app.
If ChatGPT integration is enabled, Siri will consult ChatGPT for complex requests. Complex requests might include creating an image, generating text from scratch, making recipe ideas based on what's in your refrigerator, describing what's in a photo, and more.
Siri will analyze each request to see if it's something that needs to be answered by ChatGPT, but you can also automatically invoke ChatGPT for a request by using a request like "Ask ChatGPT to give me a chocolate chip cookie recipe."
Siri will ask your permission before querying ChatGPT, but there is an option to turn off that extra permission step by toggling off "Confirm ChatGPT Requests" in the ChatGPT section of Settings.
You don't need an account to use ChatGPT, and it is free, but if you have a paid account, you can sign in. If you're not signed in, OpenAI does not store any of your ChatGPT requests, nor is your information used for training ChatGPT. If you sign in, ChatGPT can save a copy of your queries. Apple does not store ChatGPT queries.
ChatGPT can be used with Siri, but it is also integrated into Writing Tools and Visual Intelligence. With Writing Tools, ChatGPT can generate text, and with Visual Intelligence, ChatGPT can answer questions about what the Camera sees.
Mail
- There is a summarize button for summarizing any of your incoming emails, plus you will see a brief summary of an email in your inbox list rather than the first few lines of the email.
- Mail surfaces time sensitive messages first when applicable, putting them at the top of your inbox so you see what's important right away.
- Smart Reply provides quick-tap responses to emails that you've been sent, with contextual options based on what's in the email.
- Multiple notifications from Mail will be summarized on your Lock Screen so you can see what's in an email without opening the app.
Messages
- Messages has Smart Reply options for incoming texts, which analyze the content of messages to offer suggestions of what you might want to say.
- Multiple Messages notifications are summarized on your Lock Screen.
- You can use all of the Writing Tools features in the Messages app for proofreading and refining what you're planning to send.
Photos
- You can create a Memory Movie with just a description, such as "My cat in 2024," or "Orlando in the summer." The feature automatically picks relevant photos and chooses songs, but you can make tweaks through the Memory Mixes feature or choose a mood to guide the direction of the audio. You can also add in specific scenes and images you want to see throughout the memory when you're creating the prompt.
- Natural language search is available in Photos, so you can just describe what you're looking for, such as "Eric rollerskating while wearing green."
- Search can also find specific moments in video clips.
- Search offers up smart complete suggestions for narrowing down what you might want to find.
Clean Up
The Photos app also includes "Clean Up," a feature that lets you remove unwanted objects from your photos. The Clean Up tool in the Photos app is able to automatically detect objects in an image that might not be wanted, but you can also tap, circle, or brush over an unwanted object to remove it.
Zooming in on an image can help with using a finger as a brush to remove smaller blemishes and issues with an image, and it is intelligent enough not to remove part of a person even if a person or main subject is selected.
Clean Up works on all images in the Photos library, including older images and images captured by other devices like a point-and-shoot camera or a DSLR.
Transcription Summaries
In Notes and other apps, you can record audio and get a transcript along with a summary of your transcript, which is useful for recording lectures and other audio. Transcription isn't an Apple Intelligence feature, but summaries are.
Focus Modes
There is a dedicated Reduce Interruptions Focus Mode that only shows you important notifications that need attention while filtering out everything else.
When customizing an existing Focus mode or creating a new one, there is a new toggle to turn on Intelligent Breakthrough and Silencing. This option allows important notifications to interrupt you while muting notifications that are not important, which is the same thing the Reduce Interruptions Focus does. You can override the notification settings for specific apps and people, which is how it worked prior to iOS 18.1.
Notification Summaries
Your incoming notifications are summarized so you can see what's new for each app at a glance.
Phone
Summaries of transcriptions generated from your phone calls are supported in iOS 18.1 and later. The iOS 18.1 update adds the option to record a phone call and get a transcription, which is not an Apple Intelligence feature. What does require Apple Intelligence, though, is getting a summary from that transcription without reading through the entire thing.
Note that you can start a recording by tapping on the record button in the upper left corner of the display when on a phone call. All participants are notified that the call is being recorded before the recording starts.
Recorded phone calls are stored in the Notes app, where you can tap in to view a transcript and get a summary generated from that transcript.
Safari
When reading an article in Reader Mode, there is an option to have Apple Intelligence summarize the article for you.
Visual Intelligence
Visual Intelligence is an iPhone 16 feature that uses the Camera Control button. If you long press it, you can get into Visual Intelligence mode, where the Camera app can be used to identify what's around you.
If you point the camera at a store, for example, you can see ratings, hours, and other information. If you take a photo of an object, you can get more information about the object from ChatGPT, or use it with Google Search to find similar images. The Google Search feature is a good way to search for products that you want to find.
Other Visual Intelligence features include reading text out loud, detecting phone numbers and addresses to add them to the Contacts app, copying text, and summarizing long passages of text.
Additional Languages
In iOS 18.2, Apple Intelligence supports localized English in Australia, Canada, New Zealand, South Africa, Ireland, and the UK in addition to U.S. English.
Apple Intelligence Features Coming in 2025
There are additional Apple Intelligence features that will be coming in updates to iOS 18 in 2025.
Priority Notifications
Priority notifications will show up at the top of your notification stack, so you can get to what's most important first.
Siri
Some initial Siri updates are available in iOS 18.1, such as Siri's new glow that encompasses the edges of the display, and ChatGPT integration is coming in iOS 18.2, but we'll need to wait for iOS 18.3 and iOS 18.4 for additional Siri capabilities. Apple is working on onscreen awareness, personal context, and the ability to take more actions in and across apps.
Onscreen awareness will let Siri take actions when you ask something about what's on your display. If you're looking at a photo and want to message it to your friend Eric, you'll be able to tell Siri to "Send this picture to Eric," and Siri will understand and do it.
Personal context will let Siri do more with your personal data like emails and messages. This is an on-device feature, and it will let Siri learn more about you, who you're communicating with, and how you use your device. Personal context will let you do things like ask Siri to find a specific message, or remind you when you took a photo that you're looking for.
The Siri option to take more actions in and across apps will drastically improve what Siri is capable of. You'll be able to move files from one app to another and control app functions with Siri that you never could before. It'll work in third-party apps as well as Apple's own apps.
Sketch Style for Image Playground
As of right now, Image Playground has two different styles to choose from, animation and illustration. There is a third, though, that Apple plans to add in the future. Sketch is described as a "highly detailed and academic" style that "produces gorgeous drawings on stark backgrounds." It is distinct from the illustration style that has strong outlines, bold colors, and simple shapes, and the animation style that has a "whimsical, 3D cartoon look."
The animation, illustration, and sketch styles, respectively
If you want to see what the sketch style looks like, you can check it out in the Image Wand feature in the Notes app, where it is available now.
macOS Features
Memory Maker is in iOS 18.1 and iPadOS 18.1, but not macOS Sequoia 15.1. Genmoji is in iOS 18.2 and iPadOS 18.2, but not macOS Sequoia 15.2.
Apple plans to release Memory Maker and Genmoji for macOS Sequoia at a later time.
More Languages
Apple plans to add support for more languages in 2025, including Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese. Apple says that the first "set" of new languages will come in a software update in April (likely iOS 18.4), while more will come throughout the year.
Apple Intelligence Support in the European Union
Apple plans to start rolling out Apple Intelligence features to iPhone and iPad users in the European Union in April. EU users will officially get access to Writing Tools, Genmoji, the updated Siri with better language understanding, Siri ChatGPT integration, and more.
When to Expect More Apple Intelligence Features
More Apple Intelligence features will come in iOS 18.3 and iOS 18.4.
In late January or so, we'll get iOS 18.3, which could potentially have some new Siri features.
iOS 18.4, which isn't expected until around April 2025, will have the bulk of the Siri Apple Intelligence features. We're also expecting to see Apple roll out support for additional languages in 2025.
Apple Intelligence Device Requirements
Apple Intelligence requires a device with one of Apple's newest chips and at least 8GB RAM. Eligible models are listed below.
- All iPhone 16 models
- iPhone 15 Pro and iPhone 15 Pro Max
- All Apple silicon iPads
- A17 Pro iPad mini
- All Apple silicon Macs
Apple Intelligence Settings
In the Privacy and Security section of the Settings app you can access an Apple Intelligence Report that lets you export your Apple Intelligence data as part of Apple's promise for transparency around Apple Intelligence. Biometric authentication is required to access and export Apple Intelligence data.
You can also disable Apple Intelligence by toggling off the setting under the Apple Intelligence and Siri section in the Settings app.
Apple Intelligence Availability
Apple Intelligence is only available in U.S. English at this time, and it is not available in the European Union (iPhone and iPad) or China. Device region and language need to be set to the United States.
In iOS 18.2, localized English in Australia, Canada, Ireland, New Zealand, South Africa, and the UK is supported in addition to U.S. English.
Apple plans to add support for additional languages in 2025, like Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese. Apple Intelligence will launch on the iPhone and the iPad in the European Union in April 2025.
This article, "These Apple Intelligence Features Aren't Coming Until 2025" first appeared on MacRumors.com
Discuss this article in our forums
These Apple Intelligence Features Aren't Coming Until 2025 - MacRumors
Image Playground
Image Playground can be used to generate images in cartoon-like styles using a text-based prompt. Apple has built-in suggestions and concepts that you can choose from, like costumes, locations, items, and themes.
While Image Playground is a standalone app, it is also integrated into the Messages and Notes app. In these apps, Apple can use context from what you've typed for image generation suggestions.
For inspiration, you can upload or take a photo, and then get a cartoon AI version of it, and you can also use images of friends and family members pulled from the People album in the Photos app.
You can start with a base suggestion or photo and then continue to add to it until you get what you want. You can remove suggestions at any time, and save your favorite creations for use in other apps. Anytime you create something with Image Playground, you'll get multiple options so you can choose the best one.
There are only Animation and Illustration styles for Image Playground, so there is no option for creating photorealistic images.
Image Wand
Image Wand is basically the same thing as Image Playground, but in the Notes app. When you have notes that you've taken, you can circle an empty spot or some text and Image Wand will add a contextually relevant image.
So if you have notes on photosynthesis, you can add in an image of a plant under the sun. Image Wand isn't able to generate complex images, so if you want a picture of the internal structure of mitochondria, you're out of luck, but it can make a stylized image featuring the organelle.
On an iPad, you can draw a rough sketch of what you want to add to your notes with an Apple Pencil, and then use Image Wand to generate something more polished. Image Wand works on both iPhone and iPad.
Genmoji
Genmoji are custom emoji characters that you can create using a text-based description. If there's an emoji you can't find but need, like a duck eating a sandwich or an alligator skateboarding, Genmoji can make it for you.
Genmoji aren't too far off from Image Playground images, but the generation system tends to want to add a person for a lot of requests. You can choose yourself or a friend or family member, or just use a generic emoji character as your base.
In Messages and other apps, Genmoji behave like emoji, but they're not going to display properly for anyone running an operating system earlier than iOS 18.1, or on an Android device. In iOS 18.1 and iOS 18.2, they work like emoji, but Genmoji are displayed as a blank box and an accompanying full-size image on other versions of iOS.
If you have iOS 18.1 and iOS 18.2 and someone sends a Genmoji, you can long press on it and tap the "Emoji Details" button to see what prompt was used to create it, plus you can add it to your own Genmoji/sticker collection to use.
To create a Genmoji character, tap into the Emoji keyboard and tap on the Emoji with a "+" button next to the search bar. From there, you can type in your idea.
Writing Tools
- Proofread text checks for spelling and grammar errors, including word choice and sentence structure. You can accept all suggestions with a tap or go through them one by one with explanations.
- Rewrite cleans up what you've written and shifts the tone without impacting your content. Options include Friendly, Professional, and Concise.
- You can select text and get a summary of it with Apple Intelligence. You can choose to create a paragraph, pull out key points, make a list, or create a table. Summaries are available in Mail, Messages, and more.
- As of iOS 18.2, there is an open-ended Writing Tools option that lets you describe a change you want to make to something you've written. You can choose any mood or writing style that you want, with varying degrees of success.
- Also in iOS 18.2, Writing Tools has a "Compose" feature that uses Siri ChatGPT integration. With this option, Siri can leverage ChatGPT to compose writing from scratch rather than just rewriting text.
You can select any text on your iPhone, iPad, or Mac and use Apple Intelligence to access Writing Tools for summaries and other features.
Siri
- There's a new glow around the edges of the display when Siri is activated, applicable to iPhone, iPad, and CarPlay. On Mac, the Siri window can be placed anywhere. The glow animates responsively to the sound of your voice so you can tell when Siri is listening without interrupting other things you're doing.
- A double tap at the bottom of the display brings up the Type to Siri interface so you can type requests instead of speaking them. On Mac, you need to press the Command key twice to bring up Type to Siri. Type to Siri includes suggested requests so you can get your questions answered faster.
- Siri can maintain context between requests so you can ask a question and then reference it in a second reply. If you ask about the temperature in Raleigh, for example, and then follow up with "what's the humidity?" Siri should know you mean in Raleigh.
- If you stumble over your words when speaking to Siri, or change what you're saying mid-sentence, Siri will follow along.
- Siri has Apple's product knowledge and support base for answering questions about your device's features and settings, and can even find settings when you don't know the exact name by using natural language search.
Siri ChatGPT Integration
Siri ChatGPT integration lets Siri hand requests over to OpenAI's ChatGPT in iOS 18.2. ChatGPT is off by default, but you can turn it on in the Apple Intelligence and Siri section of the Settings app.
If ChatGPT integration is enabled, Siri will consult ChatGPT for complex requests. Complex requests might include creating an image, generating text from scratch, making recipe ideas based on what's in your refrigerator, describing what's in a photo, and more.
Siri will analyze each request to see if it's something that needs to be answered by ChatGPT, but you can also automatically invoke ChatGPT for a request by using a request like "Ask ChatGPT to give me a chocolate chip cookie recipe."
Siri will ask your permission before querying ChatGPT, but there is an option to turn off that extra permission step by toggling off "Confirm ChatGPT Requests" in the ChatGPT section of Settings.
You don't need an account to use ChatGPT, and it is free, but if you have a paid account, you can sign in. If you're not signed in, OpenAI does not store any of your ChatGPT requests, nor is your information used for training ChatGPT. If you sign in, ChatGPT can save a copy of your queries. Apple does not store ChatGPT queries.
ChatGPT can be used with Siri, but it is also integrated into Writing Tools and Visual Intelligence. With Writing Tools, ChatGPT can generate text, and with Visual Intelligence, ChatGPT can answer questions about what the Camera sees.
- There is a summarize button for summarizing any of your incoming emails, plus you will see a brief summary of an email in your inbox list rather than the first few lines of the email.
- Mail surfaces time sensitive messages first when applicable, putting them at the top of your inbox so you see what's important right away.
- Smart Reply provides quick-tap responses to emails that you've been sent, with contextual options based on what's in the email.
- Multiple notifications from Mail will be summarized on your Lock Screen so you can see what's in an email without opening the app.
Messages
- Messages has Smart Reply options for incoming texts, which analyze the content of messages to offer suggestions of what you might want to say.
- Multiple Messages notifications are summarized on your Lock Screen.
- You can use all of the Writing Tools features in the Messages app for proofreading and refining what you're planning to send.
Photos
- You can create a Memory Movie with just a description, such as "My cat in 2024," or "Orlando in the summer." The feature automatically picks relevant photos and chooses songs, but you can tweak through the Memory Mixes feature or choose a mood to guide the direction of the audio. You can also add in specific scenes and images you want to see throughout the memory when you're creating the prompt.
- Natural language search is available in Photos, so you can just describe what you're looking for, such as "Eric rollerskating while wearing green."
- Search can also find specific moments in video clips.
- Search offers up smart complete suggestions for narrowing down what you might want to find.
Clean Up
The Photos app also includes "Clean Up," a feature that lets you remove unwanted objects from your photos. The Clean Up tool in the Photos app is able to automatically detect objects in an image that might not be wanted, but you can also tap, circle, or brush over an unwanted object to remove it.
Zooming in on an image can help with using a finger as a brush to remove smaller blemishes and issues with an image, and it is intelligent enough not to remove part of a person even if a person or main subject is selected.
Clean Up works on all images in the Photos library, including older images and images captured by other devices like a point and shoot camera or a DSLR.
Transcription Summaries
In Notes and other apps, you can record audio and get a transcript along with a summary of your transcript, which is useful for recording lectures and other audio. Transcription isn't an Apple Intelligence feature, but summaries are.
Focus Modes
There is a dedicated Reduce Interruptions Focus Mode that only shows you important notifications that need attention while filtering out everything else.
When customizing an existing Focus mode or creating a new one, there is a new toggle to turn on Intelligent Breakthrough and Silencing. This option allows important notifications to interrupt you, while muting notifications that are not important, which is the same thing that the Reduce Notifications Focus does. You can override the notification settings for specific apps and people, which is how it worked prior to iOS 18.1.
Notification Summaries
Your incoming notifications are summarized so you can see what's new for each app at a glance.
Phone
Summaries of transcriptions generated from your phone calls are supported in iOS 18.1 and later. The iOS 18.1 update adds the option to record a phone call and get a transcription, which is not an Apple Intelligence feature. What does require Apple Intelligence, though, is getting a summary from that transcription without reading through the entire thing.
Note that you can start a recording by tapping on the record button in the upper left corner of the display when on a phone call. All participants are notified that the call is being recorded before the recording starts.
Recorded phone calls are stored in the Notes app, where you can tap in to view a transcript and get a summary generated from that transcript.
Safari
When reading an article in Reader Mode, there is an option to have Apple Intelligence summarize the article for you.
Visual Intelligence
Visual Intelligence is an iPhone 16 feature that uses the Camera Control button. If you long press it, you can get into Visual Intelligence mode, where the Camera app can be used to identify what's around you.
If you point the camera at a store, for example, you can see ratings, hours, and other information. If you take a photo of an object, you can get more information about the object from ChatGPT, or use it with Google Search to find similar images. The Google Search feature is a good way to search for products that you want to find.
Other Visual Intelligence features include reading text out loud, detecting phone numbers and addresses to add them to the Contacts app, copying text, and summarizing long passages of text.
Additional Languages
In iOS 18.2, Apple Intelligence supports localized English in Australia, Canada, New Zealand, South Africa, Ireland, and the UK in addition to U.S. English.
Apple Intelligence Features Coming in 2025
There are additional Apple Intelligence features that will be coming in updates to iOS 18 in 2025.
Priority Notifications
Priority notifications will show up at the top of your notification stack, so you can get to what's most important first.
Siri
Some initial Siri updates are available in iOS 18.1, such as Siri's new glow that encompasses the edges of the display, and ChatGPT integration is coming in iOS 18.2, but we'll need to wait for iOS 18.3 and iOS 18.4 for additional Siri capabilities. Apple is working on onscreen awareness, personal context, and the ability to take more actions in and across apps.
Onscreen awareness will let Siri take actions when you ask something about what's on your display. If you're looking at a photo and want to message it to your friend Eric, you'll be able to tell Siri to "Send this picture to Eric," and Siri will understand and do it.
Personal context will let Siri do more with your personal data like emails and messages. This is an on-device feature, and it will let Siri learn more about you, who you're communicating with, and how you use your device. Personal context will let you do things like ask Siri to find a specific message, or remind you when you took a photo that you're looking for.
The Siri option to take more actions in and across apps will drastically improve what Siri is capable of. You'll be able to move files from one app to another and control app functions with Siri that you never could before. It'll work in third-party apps as well as Apple's own apps.
Sketch Style for Image Playground
As of right now, Image Playground has two different styles to choose from, animation and illustration. There is a third, though, that Apple plans to add in the future. Sketch is described as a "highly detailed and academic" style that "produces gorgeous drawings on stark backgrounds." It is distinct from the illustration style that has strong outlines, bold colors, and simple shapes, and the animation style that has a "whimsical, 3D cartoon look."
The animation, illustration, and sketch styles, respectively
If you want to see what the sketch style looks like, you can check it out in the Image Wand feature in the Notes app, where it is available now.
macOS Features
Memory Maker is in iOS 18.1 and iPadOS 18.1, but not macOS Sequoia 15.1. Genmoji is in iOS 18.2 and iPadOS 18.2, but not macOS Sequoia 15.2.
Apple plans to release Memory Maker and Genmoji for macOS Sequoia at a later time.
More Languages
Apple plans to add support for more languages in 2025, including Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese. Apple says that the first "set" of new languages will come in a software update in April (likely iOS 18.4), while more will come throughout the year.
Apple Intelligence Support in the European Union
Apple plans to start rolling out Apple Intelligence features to iPhone and iPad users in the European Union in April. EU users will officially get access to Writing Tools, Genmoji, the updated Siri with better language understanding, Siri ChatGPT integration, and more.
When to Expect More Apple Intelligence Features
More Apple Intelligence features will come in iOS 18.3 and iOS 18.4.
In late January or so, we'll get iOS 18.3, which could potentially have some new Siri features.
iOS 18.4, which isn't expected until around April 2025, will have the bulk of the Siri Apple Intelligence features. We're also expecting to see Apple roll out support for additional languages in 2025.
Apple Intelligence Device Requirements
Apple Intelligence requires a device with one of Apple's newest chips and at least 8GB RAM. Eligible models are listed below.
- All iPhone 16 models
- iPhone 15 Pro and iPhone 15 Pro Max
- All Apple silicon iPads
- A17 Pro iPad mini
- All Apple silicon Macs
Apple Intelligence Settings
In the Privacy and Security section of the Settings app you can access an Apple Intelligence Report that lets you export your Apple Intelligence data as part of Apple's promise for transparency around Apple Intelligence. Biometric authentication is required to access and export Apple Intelligence data.
You can also disable Apple Intelligence by toggling off the setting under the Apple Intelligence and Siri section in the Settings app.
Apple Intelligence Availability
Apple Intelligence is only available in U.S. English at this time, and it is not available in the European Union (on iPhone and iPad) or in China. Device region and language need to be set to the United States.
In iOS 18.2, localized English in Australia, Canada, New Zealand, South Africa, Ireland, and the UK is supported in addition to U.S. English.
Apple plans to add support for additional languages in 2025, like Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese. Apple Intelligence will launch on the iPhone and the iPad in the European Union in April 2025.
This article, "These Apple Intelligence Features Aren't Coming Until 2025" first appeared on MacRumors.com
Housing as a Climate Resilience Strategy - Planetizen
As climate migration displaces more and more people around the world, building more resilient housing will become key to managing urban growth and shifts in population centers, writes Jonathan Reckford in Time. “According to the United Nations Environment Programme, the built environment accounts for 21% of global greenhouse gas emissions, with residential buildings responsible for 17% of the total emissions. As the planet warms and climate disasters intensify, housing’s role becomes even more critical—not only in reducing emissions but also providing stability and safety to those most affected.”
Housing-centric approaches that adapt communities and households to climate impacts, including comprehensive slum upgrading, are essential for climate actions and advancing sustainable development goals. These strategies not only address climate resilience but also provide pathways for improved health, education, and economic outcomes.
According to Habitat for Humanity research, “GDP and income per capita would increase by as much as 10.5% in some countries if housing in informal settlements were upgraded at a massive scale, and as many as 41.6 million additional children could be enrolled in school.” For Reckford, addressing the needs of people living in informal settlements — often without access to safe infrastructure or clean water — is a crucial part of building more sustainable and equitable communities.
Fire likely killed a group of Stone Age humans uncovered in Ukraine - Popular Science
Some areas of modern Europe were pretty populated during the Stone Age. Archeological evidence shows that settlements in present-day Ukraine may have had 10,000 to 15,000 people living there. Now, a new bioarcheological analysis was conducted on the remains of some of these Neolithic Europeans from an archaeological site near Kosenivka, Ukraine. The people who lived here over 5,600 years ago ate mostly plants, farmed, and some may have perished in an accidental fire. The first-of-their-kind findings are detailed in a study published December 11 in the open-access journal PLOS ONE.
The inhabitants of this settlement are associated with the Cucuteni-Trypillia culture. This group is primarily known for their pottery and lived across what is now Eastern Europe from roughly 5500 to 2750 BCE. Archeologists designate some of their settlements "mega-sites" that were home to up to 15,000 people.
"The Trypillia societies were the first successful farmers in this area," Katharina Fuchs, a study co-author and biological anthropologist and archaeologist from Kiel University in Germany, tells Popular Science. "They knew how to cultivate the environment, grow cereals and legumes, exploit the woodlands and breed their livestock. They are also known for their beautiful pottery, which they produced in a really high amount. Additionally, the settlement structures suggest early, quite complex sociopolitical systems to organize life in such megasites."
Despite the numerous artifacts the Trypillia left behind, archeologists have not found many human remains. Without these skeletal records, many facets of their lives are still undiscovered, including how they treated their dead.
Archaeological context of Kosenivka: the settlement's location, house 6 within the surrounding landscape, and the 2004 excavation of house 6. Credit: R. Hofmann (map); Kruts et al. (photos); Fuchs et al., 2024, PLOS ONE.
In this new study, Fuchs and an interdisciplinary team of researchers studied a settlement site near Kosenivka, Ukraine, where 50 human bone and tooth fragments have been recovered. The bones were found among the remains of a house and appear to belong to at least seven individuals of mixed ages and gender who may have been inhabitants of the house. Four of the individuals also have heavily charred remains and the team investigated the potential causes of these burns.
The burnt pieces of bone were primarily found in the center of the house. Previous studies of these bones proposed that the inhabitants perished in a house fire. While closely studying the pieces under a microscope, the team concluded that the burning likely occurred quickly after their deaths and was accidental. The researchers believe that some individuals might have died of carbon monoxide poisoning, even if they managed to get out of the house.
[ Related: Europe’s oldest human-made megastructure may be at the bottom of the Baltic Sea. ]
According to additional radiocarbon dating, one of the individuals died about 100 years later, and the team could not connect this person’s death to the fire. Two other individuals had unhealed cranial injuries, which raises some questions about whether violence played a role in their deaths.
The chemical analysis of bone and tooth fragments also revealed more about how they may have lived. The teeth found at the site have wear marks that indicate chewing on grains and other plant fibers to clean them.
“These Trypillian societies relied mainly on a plant-based diet and keeping cattle was not for meat production, but for milk production and to fertilise the fields,” says Fuchs.
A selection of oral and pathological conditions observed on the remains, including periodontal inflammation, dental calculus, chipping, interproximal grooving, periosteal reactions, and cribra orbitalia. Credit: Fuchs et al., 2024, PLOS ONE.
According to Fuchs, parts of this site were first excavated decades ago and have not been "destroyed directly by military offensives" in Ukraine. However, the work of archaeologists and other experts who work on cultural heritage sites like Kosenivka has been heavily impacted by the war, with damage reported to several buildings, including museums and churches. Since the invasion began in 2022, Kiel University archaeologists have committed to strengthening collaborations with Ukrainian colleagues in the face of the crisis and this study is part of that effort.
“Exploring our deep history has never been as important as it is today–our behaviour in and with the environment and, of course, with each other,” says Fuchs. “Bones are not an abstract thing, but the biological and chemical archive of a human life. Even the smallest bone fragments can help us to see ourselves in the mirror of the past and thus develop a different view of the present, but also of the future.”
The post Fire likely killed a group of Stone Age humans uncovered in Ukraine appeared first on Popular Science.
macOS Sequoia 15.2 Confirms New M4 MacBook Air Models Are Coming - MacRumors
The leaked software references the "MacBook Air (13-inch, M4, 2025)" and the "MacBook Air (15-inch, M4, 2025)," confirming that new M4 MacBook Air models are in development and are likely not too far off from launching.
It's been widely rumored that Apple is working to bring the M4 chips to its entire Mac lineup, and the MacBook Air is expected to get an M4 refresh in the spring of 2025, so sometime between March and June.
We are not expecting to see design updates for the MacBook Air, and the refresh will focus on the internals, namely the new M4 chip upgrade.
This article, "macOS Sequoia 15.2 Confirms New M4 MacBook Air Models Are Coming" first appeared on MacRumors.com
Apple Releases iOS 18.2 and iPadOS 18.2 With Genmoji, Image Playground, Siri ChatGPT and More - MacRumors
The new software can be downloaded on eligible iPhones and iPads over-the-air by going to Settings > General > Software Update.
iOS 18.2 and iPadOS 18.2 introduce new Apple Intelligence features, including the first image generation capabilities. Image Playground is a new app for creating images based on text descriptions, and you can add all kinds of costumes, items, backgrounds and more. You can even make your images look like your friends and family members.
Genmoji is similar to Image Playground, but it's for creating custom emoji characters that you can use in Messages. The third image generation feature is Image Wand, which is Image Playground but in the Notes app. You can make a rough sketch and use Apple Intelligence to make it better. The update includes ChatGPT integration for Siri, so Siri can hand complicated requests over to OpenAI's ChatGPT. For iPhone 16 users, the update adds Visual Intelligence to the Camera Control feature, so you can get more information on items and locations around you.
There are also new features for Photos, Safari, Mail, and more, with Apple's full release notes available below.
Apple Intelligence (All iPhone 16 models, iPhone 15 Pro, iPhone 15 Pro Max)
Image Playground
- A new app that lets you use concepts, descriptions, and people from your photo library to create fun, playful images in multiple styles
- Swipe through previews and choose from them as you add concepts to your playground
- Choose from animation and illustration styles when creating your image
- Create images in Messages and Freeform, as well as third party apps
- Images are synced in your Image Playground library across all your devices with iCloud
Genmoji
- Genmoji allows you to create a custom emoji right from the keyboard
- Genmoji are synced in your sticker drawer across all your devices with iCloud
ChatGPT support
- ChatGPT from OpenAI can be accessed right from Siri or Writing Tools
- Compose in Writing Tools allows you to create something from scratch with ChatGPT
- Siri can tap into ChatGPT when relevant to provide you an answer
- A ChatGPT account is not required and your requests will be anonymous and won't be used to train OpenAI's models
- Sign in with ChatGPT to access your account benefits, and requests will be covered by OpenAI's data policies
- Image Wand turns sketches and handwritten or typed notes into images in Notes
- Describe your change in Writing Tools allows you to suggest how you'd like something rewritten, for example as a poem
Camera Control (iPhone 16, iPhone 16 Plus, iPhone 16 Pro, iPhone 16 Pro Max)
- Visual Intelligence with Camera Control helps you instantly learn about places or interact with information simply by pointing your iPhone at the object, with the option to tap into Google Search or ChatGPT
- Camera Control two-stage shutter lets you lock focus and exposure in Camera when light pressing the Camera Control
Mail
- Mail Categorization sorts your messages to help you prioritize the most important messages
- Digest view groups all of the messages from one sender into a single bundle for easy browsing
Photos
- Video viewing improvements, including the ability to scrub frame-by-frame and a setting to turn off auto-looping video playback
- Improvements when navigating Collections views, including the ability to swipe right to go back to the previous view
- Recently Viewed and Recently Shared album history can be cleared
- Favorites album appears in the Utilities collection in addition to Pinned Collections
Safari
- New background images to customize your Safari Start Page
- Import and Export enables you to export your browsing data from Safari and import browsing data from another app into Safari
- HTTPS Priority upgrades URLs to HTTPS whenever possible
- File Download Live Activity shows the progress of a file download in the Dynamic Island and on your home screen
This update also includes the following improvements and bug fixes:
- Voice Memos supports layered recording, letting you add vocals over an existing song idea without the need for headphones -- then import your two-track projects directly into Logic Pro (iPhone 16 Pro, iPhone 16 Pro Max)
- Share Item Location in Find My helps you locate and recover misplaced items by easily and securely sharing the location of an AirTag or Find My network accessory with trusted third parties, such as airlines
- Natural language search in Apple Music and Apple TV app lets you describe what you're looking for using any combination of categories like genres, moods, actors, decades, and more
- Favorite Categories in Podcasts allows you to choose your favorite categories and get relevant show recommendations that you can easily access in your Library
- Personalized Search page in Podcasts highlights the most relevant categories and editorially curated collections tailored to you
- Sudoku for News+ Puzzles provided in three difficulty levels and available for News+ subscribers
- Support for the Hearing Test feature on AirPods Pro 2 in Cyprus, Czechia, France, Italy, Luxembourg, Romania, Spain, United Arab Emirates, and United Kingdom
- Support for the Hearing Aid feature on AirPods Pro 2 in United Arab Emirates
- Pre-market price quotes in Stocks lets you track NASDAQ and NYSE tickers prior to market open
- Fixes an issue where recently captured photos do not appear immediately in the All Photos grid
- Fixes an issue where Night mode photos in Camera could appear degraded when capturing long exposures (iPhone 16 Pro, iPhone 16 Pro Max)
Some features may not be available for all regions or on all Apple devices. For information on the security content of Apple software updates, please visit this website: https://support.apple.com/100100
We have a range of guides on the new Apple Intelligence features, which might be helpful if you're new to the update and wondering what you can do with Apple Intelligence.
- Everything You Need to Know About Apple Intelligence
- All of the Image Playground Features
- Genmoji in iOS 18.2: Everything You Need to Know
- Siri's Apple Intelligence Features
- Apple Intelligence Image Wand: All the New Features in iOS 18.2
More on the features available in iOS 18 can be found in our iOS 18 roundup.
This article, "Apple Releases iOS 18.2 and iPadOS 18.2 With Genmoji, Image Playground, Siri ChatGPT and More" first appeared on MacRumors.com
Apple Releases macOS Sequoia 15.2 With New Apple Intelligence Features - MacRumors
Mac users can download the macOS Sequoia update through the Software Update section of System Settings.
macOS Sequoia 15.2 adds Image Playground, an app that lets you create images based on text descriptions. You can type in whatever you like, but Apple will suggest costumes, locations, and items that you can add to an image. You can generate images that resemble your friends and family, and you can choose a photo for Image Playground to use as inspiration.
The update also adds ChatGPT integration to Siri, which is an opt-in feature. When enabled, Siri is able to hand complicated requests over to OpenAI's ChatGPT. Siri ChatGPT can essentially do much of what the dedicated ChatGPT app can do, but with the convenience of using Siri instead of a third-party app.
ChatGPT can generate text and realistic looking images from scratch, which isn't possible with Apple Intelligence, and it can tell you what you're looking at on your screen using a screenshot.
macOS Sequoia 15.2 also adds new features to Photos and Safari, with Apple's release notes for the update available below.
This update introduces new features powered by Apple Intelligence, the personal intelligence system that unlocks powerful new ways to communicate, work, and express yourself, all while protecting your data with an extraordinary step forward for privacy in AI.
New features include Image Playground which lets you create delightful, fun images, ChatGPT support integrated right into Siri and Writing Tools, and more. This release also includes enhancements to Photos and Safari, as well as other features, bug fixes, and security updates for your Mac.
Some features may not be available for all regions, or on all Apple devices.
For detailed information about the security content of this update, please visit: https://support.apple.com/100100
More information on the features that are new in macOS Sequoia can be found in our macOS Sequoia roundup.
This article, "Apple Releases macOS Sequoia 15.2 With New Apple Intelligence Features" first appeared on MacRumors.com
Apple Releases HomePod Software 18.2 With Natural Language Search for Apple Music - MacRumors
The new HomePod software adds support for Apple Music natural language search, which means you can describe what you want to hear in more casual language.
You can ask Siri for music using different combinations of genres, moods, activity, decades, and more. Examples include "songs about cats," "songs with a vibe," "relaxing songs," "artists similar to Taylor Swift," "sad 80s songs," and "songs about food."
The update also improves Enhance Dialogue with the Apple TV 4K to make speech clearer over background sounds.
Software version 18.2 includes bug fixes and stability improvements.
Siri on HomePod is now integrated with Apple Music natural language search so you can describe what you want to hear using any combination of categories like genre, mood, decade or activity.
Enhance Dialogue on HomePod (2nd generation) when paired with Apple TV 4K gives you the option to hear speech more clearly over background sounds using real-time audio processing and machine learning.
HomePod software is installed automatically on the HomePod unless the feature is disabled, but the HomePod can also be manually updated in the Home app by tapping on the More button, choosing Home Settings, and then selecting the Software Update option.
This article, "Apple Releases HomePod Software 18.2 With Natural Language Search for Apple Music" first appeared on MacRumors.com
Southeast LA Road Safety Advocates Call for Improved Infrastructure - Planetizen
It’s no secret that Los Angeles lacks a comprehensive network of protected bike lanes. According to an article by Amanda Del Cid Lugo in LA Public Press, the crisis is even more urgent in southeast Los Angeles County, where people who commute by bike or on foot often face dangerous road conditions on a daily basis.
“According to data from the University of California Berkeley’s Transportation Injury Mapping System, in the 10-year span from 2013 and 2023, 553 people were hit by cars in the city of Huntington Park alone, with the majority of accidents happening along Slauson Avenue, Gage Avenue, Pacific Boulevard, and Florence Avenue,” Del Cid Lugo explains.
A Berkeley study found that the leading cause of collisions in Huntington Park was drivers failing to yield to pedestrians and people on bikes and that a lack of traffic safety infrastructure and lighting poses a barrier to walking and biking.
Apple Releases visionOS 2.2 With Ultrawide Mac Virtual Display - MacRumors
visionOS 2.2 can be downloaded on all Vision Pro headsets by navigating to the Settings app, selecting the General section, and choosing the Software Update option.
To install an update, the Vision Pro headset needs to be removed, and there is a software progress bar available on the front EyeSight display.
visionOS 2.2 adds new wide and ultrawide aspect ratios to the Mac Virtual Display feature, so you can have more workspace when using the Vision Pro as a display for your Mac. Apple says that the ultrawide setting is the equivalent of two 5K monitors side by side.
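As a quick sanity check on that comparison, the numbers line up if you assume a 5K panel is 5120 x 2880, which Apple's note doesn't spell out: two such panels side by side form a 10240 x 2880 canvas, which works out to exactly 32:9. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope check of the "two 5K monitors side by side" comparison.
# Assumption (not stated in Apple's note): a 5K panel is 5120 x 2880 pixels.
panel_w, panel_h = 5120, 2880
combined_w, combined_h = 2 * panel_w, panel_h   # two panels side by side

print(f"combined canvas: {combined_w} x {combined_h}")          # 10240 x 2880
print(f"aspect ratio:    {combined_w / combined_h:.3f}:1")      # 3.556:1
print(f"32:9 as decimal: {32 / 9:.3f}:1")                       # 3.556:1 -- matches
```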
The update also includes Multiview for watching up to five MLS and MLB games at once, and it lets you view spatial photos and videos that are embedded on web pages. Apple's full release notes for the update are below.
Mac Virtual Display
- Use Mac apps and games with a new aspect ratio: wide (21:9) and ultrawide (32:9) - the equivalent of two 5K monitors side by side
- Route audio from your Mac to Apple Vision Pro
Apple TV
- Watch up to five MLS and MLB games at once with Multiview
- Watch live sporting events together with SharePlay
Safari
- Tap to view spatial photos and videos embedded on web pages
Some features may not be available for all regions. For more information, please visit this website: https://www.apple.com/visionos/visionos-2
For information on the security content of Apple software updates, please visit this website: https://support.apple.com/100100
More information on the Vision Pro and visionOS 2 can be found in our roundup.
This article, "Apple Releases visionOS 2.2 With Ultrawide Mac Virtual Display" first appeared on MacRumors.com
Apple Releases tvOS 18.2 With Snoopy Screen Savers and Projector Support - MacRumors
tvOS 18.2 can be downloaded using the Settings app on the Apple TV. Open up Settings and go to System > Software Update to get the new software. Apple TV owners who have automatic software updates activated will be upgraded to tvOS 18.2 automatically.
tvOS 18.2 adds a selection of new Snoopy screen savers that are available as an alternative to the aerial, memory, and portrait screen saver options, plus it includes natural language search support for Siri for looking for movies, music, and TV shows.
The update also includes an option that lets the Apple TV automatically detect the best aspect ratio for a television or projector, along with new aspect ratios for projectors. Options include 16:9, 21:9, 2.37:1, 2.39:1, 2.40:1, DCI 4K, and 32:9.
Apple shares full release notes for tvOS in its tvOS support document, which is updated after each new version of tvOS comes out.
This article, "Apple Releases tvOS 18.2 With Snoopy Screen Savers and Projector Support" first appeared on MacRumors.com
Apple Releases watchOS 11.2 - MacRumors
watchOS 11.2 can be downloaded on an iPhone running iOS 18.2 by opening up the Apple Watch app and going to General > Software Update. To install the new software, the Apple Watch needs to have at least 50 percent battery and it needs to be placed on a charger.
According to Apple's release notes, watchOS 11.2 lets you pause video that you're recording on the iPhone using the Camera Remote app on Apple Watch. Apple's full notes are below.
This update includes improvements for your Apple Watch, including:
- Tides app expands map support for tidal conditions and coastal locations in China
- Camera Remote app can pause recording of iPhone video
For information on the security content of Apple software updates, please visit this website: https://support.apple.com/100100
More of the features available in watchOS 11 can be found in our watchOS 11 roundup.
This article, "Apple Releases watchOS 11.2" first appeared on MacRumors.com
Apple Intelligence Officially Launching Today in the UK, Canada, and Four More Countries - MacRumors
Apple recently announced that iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2 will be released today, following more than six weeks of beta testing. These software updates make Apple Intelligence available in localized English spoken in Canada, Australia, New Zealand, Ireland, the U.K., and South Africa, meaning the features are formally launching in those countries for the first time. Until now, using Apple Intelligence internationally required setting the iPhone's device and Siri language to U.S. English.
Apple Intelligence is currently compatible with the iPhone 15 Pro, iPhone 15 Pro Max, all four iPhone 16 models, any Mac with an M-series chip, any iPad with an M-series chip, and the latest iPad mini model with the A17 Pro chip.
On the iPhone, the first Apple Intelligence features debuted in U.S. English as part of iOS 18.1, with key features including writing tools for summarizing and proofreading text, suggested replies in the Messages and Mail apps, notification summaries, and more. iOS 18.2 adds additional Apple Intelligence features, including Genmoji for creating custom emoji, Image Playground for generating images, ChatGPT integration for Siri, and more. Exclusive to iPhone 16 models running iOS 18.2 is a new Visual Intelligence feature that allows you to quickly identify real-world things around you using the Camera Control button.
Apple Intelligence will continue to expand, with support for English (India), English (Singapore), French, German, Italian, Chinese, Japanese, Korean, Portuguese, Spanish, and Vietnamese set to roll out throughout 2025. And more features will debut next year, such as on-screen awareness and deeper per-app controls for Siri, likely as part of iOS 18.4 in April. And with iOS 19.4 in 2026, Siri is expected to become more conversational, like ChatGPT.
This article, "Apple Intelligence Officially Launching Today in the UK, Canada, and Four More Countries" first appeared on MacRumors.com
Apple Vision Pro Named 2024 'Innovation of the Year' - MacRumors
Popular Science's annual list of the 50 greatest innovations celebrates groundbreaking achievements in science and technology, and this year, Apple's Vision Pro earned the top spot. According to Popular Science, the device represents a pivotal moment in AR innovation, providing a glimpse of the future of immersive computing.
The Vision Pro boasts a 23-million-pixel display system, delivering over 4K resolution to each eye. External cameras on the headset capture a live feed of the user's surroundings, blending the digital and physical worlds in real-time. The absence of traditional controllers or physical buttons further distinguishes the Vision Pro from its competitors, as users interact with its interface using hand gestures, eye tracking, and voice commands. Popular Science noted that these features make it "something different, important, and honestly pretty amazing."
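That "over 4K resolution to each eye" framing is easy to sanity-check with rough arithmetic, assuming the 23 million pixels are split evenly between the two displays and that "4K" means UHD (3840 x 2160); neither assumption is spelled out in the list entry:

```python
# Rough check of the per-eye pixel claim. Assumptions: the 23-million-pixel
# figure is the combined total for both displays, and "4K" means 3840 x 2160.
total_pixels = 23_000_000
per_eye = total_pixels // 2          # 11,500,000 pixels per eye
uhd = 3840 * 2160                    # 8,294,400 pixels in a 4K UHD frame

print(f"per eye: {per_eye:,} px vs 4K UHD: {uhd:,} px")
print(f"exceeds 4K per eye: {per_eye > uhd}")   # True, consistent with the claim
```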
The publication acknowledged some of the headset's hurdles, including its "massive $3,500 price tag," but emphasized its potential to drive future innovation in the AR space:
"While AR headsets have existed before, this one gets our award because of how much potential it shows. It's part of Apple's overall hardware plan. The new iPhone cameras have a specific arrangement to shoot spatial video for consumption in AR. Familiar apps can offer augmented experiences specifically meant for headsets. We expect the next version of the hardware will skip the creepy image of your eyes that shows up on the exterior screen. Still, we're curious to see what Apple does next, because a consumer-friendly price on an experience like this could be a true game changer."
Other innovations highlighted by Popular Science this year include the Oura Ring 4, Sony A9 III mirrorless camera, Boox Palma e-ink smartphone, and LG Signature OLED T TV. See the full list for more information.
This article, "Apple Vision Pro Named 2024 'Innovation of the Year'" first appeared on MacRumors.com
USDOT: Low-Income Households Bear Highest Transportation Cost Burden - Planetizen
The lowest-income U.S. households shoulder the biggest burden, proportionally, when it comes to transportation costs, according to a report from the U.S. Department of Transportation’s Bureau of Transportation Statistics.
Dan Zukowski outlines the findings in Smart Cities Dive, noting that “Although households with incomes of $28,261 or less spent the least on transportation overall, those expenses consumed nearly 32% of their pre-tax income.” Meanwhile, households at the higher end of the income scale (above $148,682) spent 9.6 percent of their income on transportation, though they spent more overall.
Transportation expenses were the second-largest average household cost for all income levels behind housing, and vehicle ownership and maintenance made up the largest purchases. “Average transportation costs for households in 10 U.S. cities jumped more than 41% over a 10-year period leading up to 2022-2023, according to a separate report from the New York State Comptroller in October.”
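To get a rough sense of what those percentages mean in dollar terms, here is an illustrative calculation that uses the income cutoffs quoted above as stand-ins for household income; they are quintile boundaries rather than averages, so treat the results as order-of-magnitude only, not as figures from the report:

```python
# Illustrative arithmetic only. The $28,261 and $148,682 figures are the income
# cutoffs quoted above (quintile boundaries, not average incomes), so the dollar
# amounts below are rough estimates, not report findings.
low_income, low_share = 28_261, 0.32
high_income, high_share = 148_682, 0.096

print(f"lowest group:  ~${low_income * low_share:,.0f}/yr on transportation "
      f"({low_share:.0%} of pre-tax income)")
print(f"highest group: ~${high_income * high_share:,.0f}/yr on transportation "
      f"({high_share:.1%} of pre-tax income)")
print(f"relative burden (share of income): {low_share / high_share:.1f}x")  # ~3.3x
```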
Geography United States Category Community / Economic Development Social / Demographics Transportation Tags Publication Smart Cities Dive Publication Date Tue, 12/10/2024 - 12:00 Publication Links Lowest-income households face highest transportation cost burden: federal repor… 1 minuteGM is killing Cruise robotaxis - Popular Science
General Motors is officially ending its support for Cruise’s beleaguered fleet of self-driving “robotaxis.” In a surprise announcement this week, the US carmaker said it will “realign its autonomous driving strategy” to end robotaxis and instead focus on eventually creating an autonomous personal vehicle. Cruise, which previously operated as a subsidiary, will now be fully absorbed by GM. That’s a major departure for the driverless car company, which had its sights set on offering paid robotaxi rides in multiple cities next year. Cruise previously proclaimed it planned to have close to a million of its autonomous vehicles flooding US streets by the end of the decade. In the end, safety concerns and a bungled 2023 California rollout may have left the company with wounds too deep to ever fully heal.
Soaring costs and stiff competition killed Cruise

GM revealed the news via press release Tuesday afternoon. The carmaker cited growing costs and increased competition as the primary drivers for the change. The company said “considerable time and resources” would have been needed to properly scale the robotaxi business and make it viable. Cruise vehicles were still being tested in multiple cities this week. Now, as part of GM, Cruise will work on improving GM’s “Super Cruise” driver-assist features in its lineup of vehicles. GM says the decision to end its robotaxi business could save the company $1 billion per year.
“You have to understand the cost of running a robotaxi fleet, which is not our core business and is very expensive,” GM CEO Mary Barra said during a conference call with Wall Street analysts, as first reported by The New York Times.
Cruise, which maintained one of the largest robotaxi operations in the US, was primarily competing against Alphabet-owned Waymo, which already offers thousands of paid driverless rides in San Francisco and Los Angeles every day. Amazon-backed Zoox, currently testing in multiple cities, is also expected to launch its robotaxi operation in Las Vegas next year. Though Cruise’s robotaxi dreams are effectively dead, GM said it’s still holding out hope for a fully autonomous personal vehicle. The immediate focus, however, seems to be on more restrained (and less controversial) advanced driver assist systems (ADAS).
“As the largest U.S. automotive manufacturer, we’re fully committed to autonomous driving and excited to bring GM customers its benefits–things like enhanced safety, improved traffic flow, increased accessibility, and reduced driver stress,” GM senior vice president of software and services engineering Dave Richardson said in the press release.
The news reportedly came as a shock to Cruise engineers, several of whom told TechCrunch they heard about the news at the same time as the media. Others told the outlet they were “blindsided” by the change of focus and are expecting layoffs as a result. Employees speaking with TechCrunch suspect the cuts could impact non-engineering roles like the government affairs and remote assistance teams that were operating in Houston, Phoenix, and other test cities. GM did not immediately respond to our request for comment but noted in its press release that it was working with Cruise to “restructure and refocus” the unit’s operations. A spokesperson from GM said it was “too early to provide specifics” regarding layoffs. The company expects the restructuring to be complete by mid-2025.
[ Related: Why are ‘driverless’ cars still hitting things? ]
Public pushback and safety issues plagued 2023 rollout

The surprise decision comes after Cruise has spent the better part of a year trying to repair a damaged reputation. After years of research and development, Cruise was finally granted approval to begin operating its robotaxis in San Francisco in August 2023. But within days, reports surfaced of Cruise vehicles blocking traffic, running through stop signs, and swerving to avoid people on the road. That rough rollout inspired immediate pushback and criticism from activists, some of whom attached traffic cones to the hoods of cars to fool Cruise’s sensors and immobilize them. All of the pressure reached a boiling point in October 2023, when a pedestrian was hit by another vehicle and flung underneath a nearby Cruise vehicle. The robotaxi dragged the pedestrian for 20 feet before finally pulling over. University of San Francisco professor and autonomous vehicle expert William Riggs previously told Popular Science that the tragic mistake was likely the result of Cruise failing to consider placing sensors capable of detecting humans underneath its vehicles.
“There wasn’t a camera underneath the vehicle, the engineers couldn’t see somebody was there,” Riggs said. “It was truly something that no one had ever thought of.”
The California DMV revoked Cruise’s license to operate in the state following the dragging incident. That led to the swift resignation of then-CEO Kyle Vogt. Not long after that, the company laid off 900 employees, or around 24% of its workforce. After months in the dark, Cruise began trying to bounce back this year. The company reportedly received $850 million in support from GM over the summer to help relaunch testing operations in Phoenix, Dallas, and Houston. Just three months ago, Cruise announced a partnership with Uber that would reportedly have let ride-hailers access its robotaxis through the Uber app in several cities next year. That’s no longer on the table.
Ultimately, Cruise was never able to fully recover from its botched San Francisco rollout. Polling shows drivers are still concerned about driverless vehicle safety, and the repeated missteps did little to alleviate those worries. Cruise also backtracked and ceded ground to Waymo, its primary competitor, at precisely the wrong time. Waymo is already offering thousands of paid trips every day and expanding operations to more cities next year. In hindsight, it’s unclear whether Cruise could ever have closed that distance and released a robotaxi service that could meaningfully compete with Waymo and other deep-pocketed rivals.
The post GM is killing Cruise robotaxis appeared first on Popular Science.
Minneapolis Awards Affordable Housing Funds - Planetizen
The city of Minneapolis announced a nearly $18-million investment in affordable housing programs aimed at assisting renters. “Proposals were evaluated based on underwriting standards, financial feasibility, location, project readiness, design guidelines and more. The funds are provided as a deferred loan with a 30- to 40-year term.”
According to a press release from the city, the funding was made available through a request-for-proposal process to organizations building affordable rental housing.
The funding, which was awarded to 11 projects across the city, comes in part from the Affordable Housing Trust Fund Program (AHTF) and the Federal Low Income Housing Tax Credit Program.
Why are crocodiles so bumpy? A dermatological mystery has been solved - Popular Science
As reptiles go, crocodiles have some pretty distinct skin. Instead of the sleek, smooth scales of snakes and lizards, crocodiles have a lumpier, more three-dimensional pattern on their scaly heads. Just how this unique skin forms has puzzled scientists, until now. The pattern of scales on their faces and jaws, it turns out, is formed by a mechanical process of skin folding, not by genetics. The findings are detailed in a study published December 11 in the journal Nature.
Polka dot patterns

Typically, animal skin appendages including hair, feathers, and scales are controlled by specific genes when an embryo is developing. There are some exceptions to this rule, including in crocodile heads.
“Crocodiles are beautiful animals with a bad reputation,” Michel Milinkovitch, a study co-author and physical biologist who helps lead the Laboratory of Artificial & Natural Evolution at the University of Geneva in Switzerland, tells Popular Science. “They are remarkable beasts for multiple reasons. One of them is that they form the sister group (i.e., are the closest relatives) of birds and dinosaurs.”
According to Milinkovitch, the fact that their body scales and head scales develop so differently also sets them apart as animals. Crocodile body scales develop from what scientists call a polka dot pattern of gene expression during the embryo’s development. This is when particular genes turn on in specific and localized regions of tissue or an organ.
[Related: Tracing the crocodiles’ curious evolutionary family tree.]
“So, at each spot of high gene expression, cells become fated to form a skin appendage–a hair, a feather or a scale, depending on the species,” explains Milinkovitch.
However, crocodile head scales are a bit different from their body scales. While taking a blood sample from a Nile crocodile, Milinkovitch was struck by the unusual pattern of scales on its jaws and face, where some of the polygons had unconnected edges. This unusual pattern couldn’t really be explained by the prevailing genetic understanding of how scales form.
Milinkovitch suspected that a mechanical process was at play, not genetics. Determining the precise mechanism had eluded scientists, though, as crocodile embryos are pretty difficult to come by.
Folds in the skin

It took over 10 years for Milinkovitch and his colleagues to gather enough embryos to conduct this new study. Once the team had several embryos to work with, they combined experiments on the embryos with computer simulations to generate a 3D mechanical growth model that details the patterning of head scales on a crocodile.
They discovered that the scales are self-organizing through some familiar mechanical processes including compressive folding. This folding begins when the skin is growing faster than the underlying bone and when the skin itself becomes either more elastic or stiff. This folding and physical change then produces the irregular geometric patterns in the head scales as the crocodile grows in a distinctly different way.
A newborn Nile crocodile with the upper jaw scanned with light-sheet microscopy to reveal the fine folds generated by the self-organised mechanical process of head-scale patterning uncovered in our study. CREDIT: G. Timin & M. C. Milinkovitch, University of Geneva, Switzerland

“Our mathematical model demonstrates that the very different patterns of head scales in different species of crocodilians are easily obtained by changing slightly the mechanical properties of the skin,” says Milinkovitch. “Hence, one does not need to call for many genes being modified to explain the evolution of head scale patterns in crocodilians: small evolutionary changes of the growth and mechanical properties of the skin explains it all.”
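The study's 3D growth model itself isn't reproduced here, but the underlying intuition, that a stiff skin layer growing faster than a softer substrate buckles into folds whose spacing depends on the stiffness mismatch, can be illustrated with the classic thin-film wrinkling estimate. The sketch below is a generic approximation under that assumption, not the authors' model, and the parameter values are hypothetical.

```python
import math

def wrinkle_wavelength(film_thickness_mm: float,
                       film_modulus: float,
                       substrate_modulus: float) -> float:
    """Classic estimate for a stiff thin film bonded to a soft substrate:
    wavelength ~ 2*pi*h * (E_film / (3*E_substrate)) ** (1/3).
    Only the stiffness ratio matters, so the moduli just need consistent units."""
    return 2 * math.pi * film_thickness_mm * (film_modulus / (3 * substrate_modulus)) ** (1 / 3)

# Hypothetical skin parameters: modest changes in the stiffness mismatch shift the
# fold spacing, echoing the paper's point that different crocodilian head patterns
# need only slight changes in skin mechanics, not different genetic programs.
for stiffness_ratio in (10, 50, 200):
    spacing = wrinkle_wavelength(film_thickness_mm=0.5,
                                 film_modulus=float(stiffness_ratio),
                                 substrate_modulus=1.0)
    print(f"film/substrate stiffness {stiffness_ratio:>3}x -> fold spacing ~{spacing:.1f} mm")
```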
[Related: Say hello to the surprising crocodile relative Benggwigwishingasuchus eremicarminis.]
Staining collagen

In order to pinpoint these mechanics, the team also needed to develop new ways of staining collagen. This protein builds skin, bone, ligaments, tendons, and other connective tissues, and it is also crucial for protecting the body’s organs, among other biological roles. The new staining technique helped Milinkovitch see what mechanical properties collagen gives to the skin, and understanding how collagen in the skin works has applications outside of crocodiles.
“Our technique is now being picked up by many researchers because the 3D architecture of collagen is very, very important for understanding the invasiveness of cancer tumours as well as for understanding the aging of the skin,” says Milinkovitch.
Developing new methods for something as wide-reaching as collagen staining and figuring out what is going on underneath the skin of something like a crocodile also points to an often overlooked part of embryonic development.
“It [mechanics] plays a huge role in embryonic development and researchers in biology and physics are slowly realising this,” says Milinkovitch. “Stay curious, look around you, there are so many aspects of the living world we do not understand.”
The post Why are crocodiles so bumpy? A dermatological mystery has been solved appeared first on Popular Science.
Scientists figured out the optimal cup of coffee - Popular Science
When coffee fans debate the optimal brew, the argument often centers on the relationship between roast strategies and caffeine levels. But after a recent gauntlet of laboratory tests, researchers believe they have identified the blend of factors behind a perfect cup that balances flavor and zing—and dark roast diehards may want to finish taking a sip before finding out the results.
“Over 20 years ago, I heard a barista claim that dark roasts have more caffeine, but a decade later, I was exposed to the contrasting idea that light roasts were the king of caffeine. Yet, I couldn’t find any convincing data,” Zachary Lindsey, an assistant professor of physics at Georgia’s Berry College, said in a statement.
Lindsey’s team recently analyzed the relationship between coffee beans’ chemical and physical attributes through various roasting and brewing scenarios. While the trial focused only on natural- and washed-processed Ethiopian beans, the researchers examined five roast levels across brew times of one, two, and ten minutes, using a machine with a 15:1 water-to-coffee ratio.
“When selecting a brew method, the main goal was to implement a procedure that could consistently produce brews within a wide range of extraction yields by only varying the brew time,” said Lindsey.
The results, published in the journal Scientific Reports, offer a detailed look at 30 unique coffee combinations on microscopic and chemical levels. Lindsey and colleagues used high-performance liquid chromatography (HPLC) to assess the molecular makeup of soluble compounds like caffeine and chlorogenic acids. In this process, a brew’s compounds are separated according to their interactions with a standard material to measure concentration. Scanning electron microscopy (SEM) imaged whole and ground beans to provide observers with a close look at porosity and grain size, thereby better highlighting how roasting physically affects coffee.
SEM images showing porosity evolution from green coffee (G) through R0-R7 roast batches of washed Ethiopian coffee. Credit: Scientific Reports

For the third and final tool, the team turned to refractometry—the measurement of light bending—to learn brewing extraction yields. According to the study, attributes such as caffeine content are the result of intricate relationships between how coffee is roasted and how well its organic compounds can dissolve into water.
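For readers curious how a refractometer reading becomes an extraction yield, the standard conversion multiplies the measured total dissolved solids (TDS) by the beverage mass and divides by the dry coffee dose. The sketch below applies that textbook formula to hypothetical numbers alongside the article's 15:1 water-to-coffee ratio; it is not the paper's own calculation.

```python
# Standard coffee-brewing arithmetic with hypothetical example values.
# A refractometer reports TDS, the percentage of the beverage that is dissolved
# coffee; extraction yield is the dissolved mass divided by the dry dose.
dose_g = 20.0                        # dry coffee dose (hypothetical)
water_g = dose_g * 15                # the study's 15:1 water-to-coffee ratio
beverage_g = water_g - dose_g * 2.0  # rough allowance: ~2 g of water retained per gram of grounds
tds_percent = 1.35                   # hypothetical refractometer reading

dissolved_g = beverage_g * tds_percent / 100
extraction_yield = dissolved_g / dose_g * 100

print(f"Brew water: {water_g:.0f} g, beverage in the cup: {beverage_g:.0f} g")
print(f"Extraction yield: {extraction_yield:.1f}% of the dry dose dissolved into the brew")
```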
As Lindsey explained, “During roasting, the volume and porosity of the coffee seeds increase as the roast progresses, which makes it easier for compounds to move in or out of the system.”
For example, porosity, or the measure of a coffee ground’s void spaces, grows due to longer grinding times. This means that a larger amount of each coffee ground’s internal surface area is exposed to water. But when it comes to caffeine, light and medium roasts on average measured higher levels than darker variants across the roasting spectrum samples. Lindsey’s team explained that this is due to the amount of caffeine that is lost during the process. Darker roasts, however, maintained higher caffeine levels than lighter roasts when porosity and extraction yields were uniform across all varieties.
“Although the interplay between roast degree and caffeine content has been addressed over 20 times in the literature, the prevailing theory is that caffeine remains stable during the roasting process,” Lindsey argues. “However, we establish a clear relationship between roast degree, caffeine content, and extraction yield.”
Like most preferences, your ideal coffee brew is ultimately a matter of personal taste (or access to ultrasonic frequencies). But if your main goal at the end of the day is to get a much-needed energy boost, Lindsey suggests a medium roast blend for the most caffeine per cup.
The post Scientists figured out the optimal cup of coffee appeared first on Popular Science.
'M4 Extreme' Chip Unlikely After Apple 'Cancels' High-Performance Chip - MacRumors
Based on the description of the chip in a report from The Information, it sounds like Apple has canceled a previously rumored "Extreme" chip for the Mac. It was previously reported that an "M2 Extreme" chip was scrapped a few years ago, but perhaps Apple had revisited the idea since then. In any case, it now sounds like an "M4 Extreme" chip is also unlikely.
Apple likely would have introduced the "M4 Extreme" in its high-end Mac Pro tower. The chip would have offered even faster performance than the M4 Ultra chip that is expected to launch in new Mac Studio and Mac Pro models later next year.
If the "M4 Extreme" had been a quadrupled version of the M4 Max chip that debuted in the MacBook Pro a few months ago, it would have had massive specifications, including up to a 64-core CPU and a 160-core GPU.
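Those core counts follow from simple multiplication. The sketch below assumes the top M4 Max configuration's publicly listed 16-core CPU and 40-core GPU; the "Extreme" chip itself was never announced, so the result is purely hypothetical.

```python
# "Extreme"-class Apple silicon has been rumored as four Max dies fused together.
# Assumption: top M4 Max configuration = 16 CPU cores, 40 GPU cores.
m4_max_cpu_cores, m4_max_gpu_cores = 16, 40
scale = 4  # a quadrupled "Extreme" design, as described in the article

print(f"Hypothetical M4 Extreme CPU cores: {m4_max_cpu_cores * scale}")  # 64
print(f"Hypothetical M4 Extreme GPU cores: {m4_max_gpu_cores * scale}")  # 160
```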
While the "Extreme" chip may be off the table once again, it seems like Apple has repeatedly shown interest in developing such a chip, so perhaps it will eventually materialize as part of the M5 series or later. For now, though, the wait continues.Related Roundup: Mac ProTag: The InformationBuyer's Guide: Mac Pro (Neutral)Related Forum: Mac Pro
This article, "'M4 Extreme' Chip Unlikely After Apple 'Cancels' High-Performance Chip" first appeared on MacRumors.com
Discuss this article in our forums
Cloud-Based M4 and M4 Pro Mac Mini Models Now Available - MacRumors
Mac cloud provider MacWeb has launched three configurations of the new Mac mini, powered by Apple's M4 and M4 Pro chips. Developers and IT teams can rent these machines for tasks ranging from basic development to advanced artificial intelligence modeling, providing an efficient and scalable infrastructure option without the need to purchase expensive hardware outright. A new Mac mini can range from $599 for a base M4 model to $1,999 for a high-end M4 Pro model with 64GB of unified memory.
The three configurations include the MacWeb Base M4 at $99 per month, the MacWeb Power M4 Pro at $199 per month, and the MacWeb Ultimate M4 Pro at $299 per month. The Base M4, featuring the standard M4 chip, is designed for virtual desktops and small-scale tasks. The Power M4 Pro includes a 12-core CPU and 24GB of unified memory, making it suitable for application development and testing. The Ultimate M4 Pro, MacWeb's most advanced tier, offers a 14-core CPU, a 20-core GPU, and 64GB of unified memory, capable of handling intensive workloads such as AI model training and mission-critical applications.
MacWeb touts the potential of its M4 Pro configurations to support advanced networking capabilities using Thunderbolt 5. According to the company, Thunderbolt 5 delivers 80 Gbps of bi-directional bandwidth, a performance leap described as being up to 800 percent faster than 10G Ethernet. This apparently enables seamless clustering of Mac minis, allowing users to pool resources for distributed computing tasks, including video editing and large-scale software testing.
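To get a feel for what that bandwidth gap means, here is a simple back-of-the-envelope transfer-time comparison. It ignores protocol overhead, storage speed, and other real-world limits, so treat it as an upper bound on what either link could do; the dataset size is hypothetical.

```python
# Idealized transfer-time comparison; real-world throughput will be lower.
dataset_gb = 500               # hypothetical dataset size in gigabytes
dataset_gbit = dataset_gb * 8  # gigabytes -> gigabits

for name, gbps in [("Thunderbolt 5 (claimed)", 80), ("10G Ethernet", 10)]:
    seconds = dataset_gbit / gbps
    print(f"{name:>24}: {seconds / 60:5.1f} minutes to move {dataset_gb} GB at line rate")
```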
Companies like AWS have offered similar services in recent years, but MacWeb's integration of Apple's latest Mac hardware positions it at the forefront of the market, along with MacStadium. MacWeb has retained its M2-based offerings for developers with less demanding performance requirements.
This article, "Cloud-Based M4 and M4 Pro Mac Mini Models Now Available" first appeared on MacRumors.com
Discuss this article in our forums
Apple Watch Series 10 Available for Black Friday Prices and Christmas Delivery on Amazon - MacRumors
Note: MacRumors is an affiliate partner with Amazon. When you click a link and make a purchase, we may receive a small payment, which helps us keep the site running.
You'll find $69 off both 42mm and 46mm GPS Series 10 models in multiple case colors and band styles, all requiring an on-page coupon in order to see the discount at checkout. Every price listed below matches the record-low prices we saw on Black Friday. If you've been eyeing an Apple Watch Series 10 as a present, now is the time to purchase one at these record-low prices.
Note: You won't see the deal price until checkout.
$69 OFF Apple Watch Series 10 (42mm GPS) for $329.99
$69 OFF Apple Watch Series 10 (46mm GPS) for $359.99
If you're also shopping for a few new Apple Watch bands, Woot is hosting a big sale this week on Solo Loop and Braided Solo Loop bands. You'll find Solo Loop bands at $19.99 ($29 off) and Braided Solo Loop bands at $29.99 ($69 off) during this event, making them the lowest prices we've ever tracked for each style in brand-new condition.
42mm GPS Apple Watch Series 10
- Jet Black Aluminum Case with Black Sport Band (S/M) - $329.99 with on-page coupon, down from $399.00
- Rose Gold Aluminum Case with Light Blush Sport Band (S/M) - $329.99 with on-page coupon, down from $399.00
46mm GPS Apple Watch Series 10
- Jet Black Aluminum Case with Black Sport Band (M/L) - $359.99 with on-page coupon, down from $429.00
- Jet Black Aluminum Case with Black Sport Band (S/M) - $359.99 with on-page coupon, down from $429.00
- Jet Black Aluminum Case with Ink Sport Loop - $359.99 with on-page coupon, down from $429.00
- Rose Gold Aluminum Case with Light Blush Sport Band (M/L) - $359.99 with on-page coupon, down from $429.00
- Rose Gold Aluminum Case with Plum Sport Loop - $359.99 with on-page coupon, down from $429.00
- Silver Aluminum Case with Blue Cloud Sport Loop - $359.99 with on-page coupon, down from $429.00
- Silver Aluminum Case with Denim Sport Band (S/M) - $359.99 with on-page coupon, down from $429.00
- Silver Aluminum Case with Denim Sport Band (M/L) - $359.99 with on-page coupon, down from $429.00
Deals Newsletter
Interested in hearing more about the best deals you can find during the holiday season? Sign up for our Deals Newsletter and we'll keep you updated so you don't miss the biggest deals of the season!
This article, "Apple Watch Series 10 Available for Black Friday Prices and Christmas Delivery on Amazon" first appeared on MacRumors.com
Discuss this article in our forums