Inside Apple’s Next UI Push: Liquid Glass, Tokens, and Cross-Tool Sync

Published March 30, 2026 · Jordan Kim

TL;DR

  • Apple appears to be expanding its material system (“Liquid Glass”) across more UI surfaces
  • Early signals suggest a move toward first-class design tokens in SwiftUI
  • There are hints Apple may be exploring better integration with design tools like Figma
  • This could significantly reduce the gap between design and implementation

A Shift That’s Been Building for Years

Over the past few iOS releases, Apple has steadily expanded its material system in SwiftUI:


.background(.ultraThinMaterial)

What started as a way to add subtle depth has evolved into something more foundational. Materials are no longer just visual flourishes — they’re increasingly shaping layout, hierarchy, and accessibility.

Internally, some developers have started referring to this direction as “Liquid Glass” — a system where surfaces feel adaptive, layered, and responsive to their environment.

Why “Liquid Glass” Matters

The key advantage of system materials is that they adapt automatically:

  • Contrast adjusts based on background content
  • Transparency responds to accessibility settings
  • Colors shift between light and dark modes

In practice, this means developers can rely less on hardcoded visual decisions and more on system-provided behaviors.
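To make that adaptation concrete, here is a minimal sketch. SwiftUI's materials already respond to Reduce Transparency on their own; reading `accessibilityReduceTransparency` (a real environment value) explicitly, as below, just makes the behavior visible. The view name and the opaque fallback color are illustrative choices, not Apple's actual behavior:

```swift
import SwiftUI

struct AdaptiveCard: View {
    // Real SwiftUI environment value reflecting the user's accessibility setting.
    @Environment(\.accessibilityReduceTransparency) private var reduceTransparency

    var body: some View {
        Text("Now Playing")
            .padding()
            .background {
                if reduceTransparency {
                    // Illustrative opaque fallback when transparency is reduced.
                    Color(white: 0.95)
                } else {
                    // The system material adapts contrast and tint on its own.
                    RoundedRectangle(cornerRadius: 12)
                        .fill(.ultraThinMaterial)
                }
            }
    }
}
```

The point is less the branch itself than where the decision lives: with materials, the system owns the adaptation, and the developer only opts in.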

However, this also introduces a new challenge: how do you ensure consistency across an app when so much of the UI is dynamic?

The Missing Piece: Design Tokens

For most teams, the answer has been design tokens — named values that represent things like colors, spacing, typography, and corner radius.

In SwiftUI today, implementing tokens typically looks something like this:


import SwiftUI

extension Color {
    static let primaryBackground = Color("PrimaryBackground")
}

struct Spacing {
    static let medium: CGFloat = 12
}

This works, but it’s entirely developer-defined. There’s no system-level understanding of what these values represent or how they should adapt.
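Teams often wrap these values in a slightly more structured layer so call sites express intent rather than raw numbers. A sketch of that pattern is below; `SpacingToken` and the `padding(_:)` overload are hypothetical names, not SwiftUI API:

```swift
import SwiftUI

// Developer-defined spacing scale; the system knows nothing about it.
enum SpacingToken: CGFloat {
    case small = 8, medium = 12, large = 20
}

extension View {
    // Token-based padding: call sites say .medium, not 12.
    func padding(_ token: SpacingToken) -> some View {
        padding(token.rawValue)
    }
}

// Usage: Text("Hello").padding(.medium)
```

Even with this layer, the semantics live entirely in app code, which is exactly the gap a system-level token model would close.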

Signals of a Change

In recent betas and sessions, there have been subtle hints that Apple may be exploring a more unified approach:

  • Increased emphasis on semantic styling over explicit values
  • More APIs that derive appearance from environment context
  • Internal naming patterns that suggest token-like abstractions

None of this confirms a formal system yet — but it does point toward a direction where styling becomes more declarative and system-driven.
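The "semantic over explicit" direction is already visible in shipping API. The snippet below uses only existing SwiftUI: hierarchical foreground styles and materials resolve from context rather than from hardcoded values:

```swift
import SwiftUI

struct SemanticRow: View {
    var body: some View {
        HStack {
            Label("Downloads", systemImage: "arrow.down.circle")
                .foregroundStyle(.primary)    // resolves per color scheme
            Spacer()
            Text("42")
                .foregroundStyle(.secondary)  // hierarchy, not a fixed gray
        }
        .padding()
        // The material and shape adapt to light/dark and background content.
        .background(.regularMaterial, in: RoundedRectangle(cornerRadius: 12))
    }
}
```

Nothing here names a color or an opacity; every visual decision is delegated to the environment.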

Bridging Design and Code

One of the biggest friction points for teams today is the disconnect between design tools and implementation.

Designers define tokens in Figma, but translating those into SwiftUI is still a manual process. Even with automation scripts, the pipeline is fragile and often out of sync.
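A common shape for that manual bridge today is a small codegen step: decode a token export and emit Swift constants. The sketch below assumes a made-up JSON layout (not Figma's actual export format), and the emitted `Color(hex:)` initializer is a hypothetical helper:

```swift
import Foundation

// Hypothetical token-export schema, not Figma's real format.
struct TokenFile: Decodable {
    struct ColorToken: Decodable {
        let name: String   // e.g. "primaryBackground"
        let hex: String    // e.g. "#1C1C1E"
    }
    let colors: [ColorToken]
}

let json = """
{ "colors": [ { "name": "primaryBackground", "hex": "#1C1C1E" } ] }
"""

// try! for brevity in this sketch; a real script would handle errors.
let tokens = try! JSONDecoder().decode(TokenFile.self, from: Data(json.utf8))
for token in tokens.colors {
    // A real pipeline would write this into a generated .swift file.
    print("static let \(token.name) = Color(hex: \"\(token.hex)\")")
}
```

Every step of this pipeline is app-owned, which is why it drifts: rename a token in the design tool and nothing forces the Swift side to follow.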

What This Could Mean for SwiftUI

If these pieces come together, SwiftUI could move toward a model where:

  • Materials handle environmental adaptation
  • Tokens define semantic intent
  • The system bridges the two automatically

Imagine something like:


.background(.surface(.primary))
.padding(.spacing(.medium))
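You can approximate the first of those call sites today with a constrained `ShapeStyle` extension, which is how SwiftUI's own dot-syntax styles resolve. In this sketch, `SurfaceLevel`, `surface(_:)`, and the material choices are invented; only the extension mechanism is existing Swift/SwiftUI:

```swift
import SwiftUI

// Hypothetical semantic surface levels.
enum SurfaceLevel {
    case primary, secondary
}

extension ShapeStyle where Self == AnyShapeStyle {
    // Enables .background(.surface(.primary)) via static member lookup.
    static func surface(_ level: SurfaceLevel) -> AnyShapeStyle {
        switch level {
        case .primary:   AnyShapeStyle(.ultraThinMaterial)
        case .secondary: AnyShapeStyle(.regularMaterial)
        }
    }
}
```

The difference in a first-party version would be that the system, not the app, decides what `.primary` resolves to in each context.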

Looking Ahead to WWDC

With WWDC just around the corner, it’s likely we’ll get more clarity on where Apple is heading.

Even if a full design token system isn’t announced this year, the continued investment in materials and semantic APIs suggests that Apple is moving toward a more unified UI model.

Final Thoughts

SwiftUI has always aimed to simplify UI development by focusing on intent over implementation.

“Liquid Glass,” design tokens, and cross-tool integration all point toward the same goal: reducing the gap between what we design and what we build.

If Apple delivers on even part of this vision, it could fundamentally change how teams approach UI architecture.