App Development Technical Docs
Haptic Module
Touch Strategies Across Mini-Games
Harmony With Me – Thick Pulse Feedback
User taps are paired with dense, higher-amplitude haptic pulses, giving a sense of weight and reinforcement to each rhythmic interaction.
This aligns the felt sensation with the sonic and visual beat events, strengthening the perception of stability and presence.
Feel Your Vibe – Frequency-Adaptive Vibration
Haptic frequency mirrors audio tremolo frequency:
continuous pulses at low LFO rates; sharper and more transient patterns at higher rates.
This creates a tactile rhythm texture that reflects the LFO’s temporal structure, allowing users to “feel” rhythm variance.
Touch for Music – Universal Tactile Presence
Every user interaction triggers tactile feedback—though its technical simplicity masks a design choice:
The constant tactile response works as a somatic anchor, linking every gesture with a physical sensation and therefore increasing the sense of agency and rhythm embodiment.
HapticManager Enum
The HapticManager enum is a utility for managing haptic feedback on devices that support Core Haptics. It provides various methods to trigger different types of haptic feedback patterns. Below is a detailed explanation of each haptic mode and the corresponding algorithm used to create the vibration patterns.
enum HapticManager {
    static nonisolated(unsafe) private var engine: CHHapticEngine?

    static func prepareHaptics()
    static private func restartEngine()
    static func triggerNormalHaptic()
    static func triggerMultiPattern(frequency: Double, amplitude: Double) async
    static func triggerGradualHaptic(intensity: Float = 0.6)
    static func triggerHeavyHaptic()
}
Methods
prepareHaptics()
This method initializes the haptic engine if the device supports haptics. It starts the engine and prints a message indicating the engine has started.
restartEngine()
This private method attempts to restart the haptic engine if it fails during operation.
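For reference, a possible shape for these two methods is sketched below. The log messages, the reset-handler wiring, and the HapticManagerSketch name are illustrative assumptions, not the project's exact code.

import CoreHaptics

// Sketch of a possible engine setup and restart path.
enum HapticManagerSketch {
    static nonisolated(unsafe) private var engine: CHHapticEngine?

    static func prepareHaptics() {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        do {
            engine = try CHHapticEngine()
            // Restart automatically if the system resets the engine.
            engine?.resetHandler = { restartEngine() }
            try engine?.start()
            print("Haptic engine started")
        } catch {
            print("Failed to start haptic engine: \(error)")
        }
    }

    static private func restartEngine() {
        do {
            try engine?.start()
        } catch {
            print("Failed to restart haptic engine: \(error)")
        }
    }
}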
triggerNormalHaptic()
This method triggers a single, short haptic tap. It creates a transient haptic event with moderate intensity and low sharpness.
Algorithm:
- Check if the haptic engine is available.
- Create haptic event parameters for intensity (0.4) and sharpness (0.2).
- Create a transient haptic event with the parameters.
- Create a haptic pattern with the event.
- Create a player for the pattern and start it.
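A minimal sketch of this algorithm with Core Haptics is shown below. The helper is hypothetical and takes the engine as a parameter; the project's method presumably reads the shared engine property instead.

import CoreHaptics

// Hypothetical helper illustrating the transient-tap algorithm above.
func playNormalTap(on engine: CHHapticEngine) throws {
    let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4)
    let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
    // A single transient event produces one short tap.
    let event = CHHapticEvent(eventType: .hapticTransient,
                              parameters: [intensity, sharpness],
                              relativeTime: 0)
    let pattern = try CHHapticPattern(events: [event], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}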
triggerMultiPattern(frequency: Double, amplitude: Double) async
This method triggers a haptic pattern based on the provided frequency and amplitude. It differentiates between low and high frequencies to create continuous or transient patterns.
Note: This pattern is designed specifically for the Feel Your Vibe game.
Algorithm:
- Check if the haptic engine is available.
- For low frequencies (<= 2.0):
  - Create continuous haptic event parameters for intensity (amplitude * 0.7) and sharpness (0.3).
  - Create a continuous haptic event with a duration of 0.2 seconds.
- For high frequencies (> 2.0):
  - Create transient haptic event parameters for intensity (amplitude * 0.6) and sharpness (0.5).
  - Create a transient haptic event.
- Create a haptic pattern with the event.
- Create a player for the pattern and start it.
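The frequency branch can be sketched as follows. Again, this is a hypothetical helper that takes the engine as a parameter; the intensity, sharpness, and duration values follow the algorithm above.

import CoreHaptics

// Hypothetical helper illustrating the frequency-adaptive pattern.
func playTremoloHaptic(on engine: CHHapticEngine, frequency: Double, amplitude: Double) throws {
    let event: CHHapticEvent
    if frequency <= 2.0 {
        // Low LFO rates: a longer, softer continuous buzz.
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: Float(amplitude * 0.7))
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
        event = CHHapticEvent(eventType: .hapticContinuous,
                              parameters: [intensity, sharpness],
                              relativeTime: 0,
                              duration: 0.2)
    } else {
        // High LFO rates: a short, sharper transient tick.
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: Float(amplitude * 0.6))
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        event = CHHapticEvent(eventType: .hapticTransient,
                              parameters: [intensity, sharpness],
                              relativeTime: 0)
    }
    let pattern = try CHHapticPattern(events: [event], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}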
triggerGradualHaptic(intensity: Float = 0.6)
This method triggers a haptic pattern that gradually increases and then decreases in intensity over a duration of 2 seconds.
Algorithm:
- Check if the device supports haptics and if the haptic engine is available.
- Create a series of continuous haptic events with varying intensity to form a bell curve.
- Each event has a duration of 0.1 seconds and the intensity peaks in the middle.
- Create a haptic pattern with the events.
- Create a player for the pattern, start the engine, and start the player.
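The 2-second ramp can be built from twenty 0.1-second slices, as described above. The sine-based bell shaping in this hypothetical helper is an assumption about the exact curve.

import Foundation
import CoreHaptics

// Hypothetical helper illustrating the rise-and-fall intensity curve.
func playGradualHaptic(on engine: CHHapticEngine, peakIntensity: Float = 0.6) throws {
    let sliceCount = 20                        // 20 slices x 0.1 s = 2 s total
    let sliceDuration: TimeInterval = 0.1
    var events: [CHHapticEvent] = []
    for i in 0..<sliceCount {
        let progress = Double(i) / Double(sliceCount - 1)  // runs 0...1 across the slices
        let bell = sin(progress * .pi)                     // 0 at the ends, 1 at the midpoint
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                               value: peakIntensity * Float(bell))
        events.append(CHHapticEvent(eventType: .hapticContinuous,
                                    parameters: [intensity],
                                    relativeTime: Double(i) * sliceDuration,
                                    duration: sliceDuration))
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}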
triggerHeavyHaptic()
This method triggers a haptic pattern that starts with strong intensity and gradually decreases.
Algorithm:
- Check if the haptic engine is available.
- Create a series of continuous haptic events with decreasing intensity.
- Each event has a duration of 0.2 seconds.
- Create a haptic pattern with the events.
- Create a player for the pattern and start it.
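A compact hypothetical sketch of the decaying series is shown below; the number of steps is an assumption, while the 0.2-second event duration follows the algorithm above.

import CoreHaptics

// Hypothetical helper: intensity steps down from a strong start.
func playHeavyHaptic(on engine: CHHapticEngine) throws {
    let steps: [Float] = [1.0, 0.8, 0.6, 0.4, 0.2]   // assumed decay values
    let events = steps.enumerated().map { index, value in
        CHHapticEvent(eventType: .hapticContinuous,
                      parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: value)],
                      relativeTime: Double(index) * 0.2,
                      duration: 0.2)
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}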
Development - Harmony With Me
I. Game Philosophy
The game is an interactive music experience where users create harmonious sounds by tapping a circle that appears at random positions on the screen. The game features background music, haptic feedback, and inactivity monitoring to enhance the user experience.
II. Key Components
- File Management
  - Uses FileSystemManager for audio resource handling
  - Checks for essential files like "testBGM.m4a"
- Audio System
  - Managed through AudioManager
  - Handles background music and harmony sounds
- User Interaction
  - Uses DragGesture for touch detection
  - Implements circle-based touch area detection
  - Includes haptic feedback via HapticManager
- State Management
  - Tracks visibility state (showHomeButton)
  - Monitors gameplay state
  - Handles cleanup on view disappear
III. Main View Structure
- Imports and Declarations
  - Import necessary frameworks: SwiftUI, AVFoundation, CoreHaptics
  - Define the MusicDelegate class for handling background music looping
- Main View: HarmonyWithMe
  - State properties for managing game state, audio players, timers, and UI elements
  - Constants for game configuration (e.g., circle duration, sound groups)
- body View
  - NavigationStack with GeometryReader for responsive layout
  - ZStack for layering UI elements:
    - Background color
    - Instruction text
    - Interactive circle
    - Home button
- Lifecycle Handlers
  - .onAppear to initialize game components and start background music
  - .gesture to handle touch interactions
  - .onDisappear to clean up resources
- Helper Methods
  - startBackgroundMusic(): Initialize and start background music with looping
  - initializeGame(): Set up initial game state and start timers
  - playRandomSoundFromCurrentGroup(): Play a random sound from the current sound group
  - isPlayingHarmonySound(): Check if a harmony sound is currently playing
  - isTouchInsideCircle(at:): Determine if a touch is inside the interactive circle (see the sketch after this list)
  - showCircleAtRandomPosition(): Display the interactive circle at a random position
  - startMusicTimer(): Monitor background music playback position
  - checkMusicPosition(): Check if it's time to show the interactive circle
  - startInactivityTimer(): Start a timer to monitor user inactivity
  - stopInactivityTimer(): Stop the inactivity timer
  - resetInactivityTimer(): Reset the inactivity timer and hide the home button
  - checkForInactivity(): Check if the user has been inactive for a certain period
  - resetGame(): Reset the game state
  - fadeOutAndClean(): Fade out audio and clean up resources
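To make the circle hit-testing concrete, a hypothetical free-function version of isTouchInsideCircle(at:) is sketched below; the real method likely reads the circle's position and size from view state rather than taking them as parameters.

import CoreGraphics

// Hypothetical sketch of circle-based touch area detection.
func isTouchInsideCircle(at location: CGPoint,
                         circleCenter: CGPoint,
                         circleDiameter: CGFloat) -> Bool {
    // A touch counts as a hit when its distance from the circle's
    // center is no greater than the radius.
    let dx = location.x - circleCenter.x
    let dy = location.y - circleCenter.y
    return (dx * dx + dy * dy).squareRoot() <= circleDiameter / 2
}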
IV. The Self-Defined Class MusicDelegate
The MusicDelegate class is a custom delegate for handling background music playback events in the HarmonyWithMe game. It conforms to the AVAudioPlayerDelegate protocol and is primarily responsible for detecting when the background music has finished playing and triggering appropriate actions to reset the game state and loop the music.
A. How It Works
- Initialization
  - The MusicDelegate is initialized with a closure (onMusicLoop) that defines the actions to be taken when the background music finishes playing.
  - This closure is stored in the onMusicLoop property.
- Conforming to AVAudioPlayerDelegate
  - The MusicDelegate class conforms to the AVAudioPlayerDelegate protocol, which requires the implementation of the audioPlayerDidFinishPlaying(_:successfully:) method.
- Handling Music Playback Completion
  - The audioPlayerDidFinishPlaying(_:successfully:) method is called automatically by the AVAudioPlayer instance when the music finishes playing.
  - The method checks whether playback finished successfully using the flag parameter.
  - If playback finished successfully, the method prints a debug message and calls the onMusicLoop closure to reset the game and restart the music.
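Based on the behavior described above, a minimal sketch of the class could look like the following; the exact debug message is illustrative.

import AVFoundation

// Sketch of the delegate: stores the loop closure and fires it when playback finishes successfully.
class MusicDelegate: NSObject, AVAudioPlayerDelegate {
    let onMusicLoop: () -> Void

    init(onMusicLoop: @escaping () -> Void) {
        self.onMusicLoop = onMusicLoop
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        guard flag else { return }
        print("🔁 Background music finished playing")
        onMusicLoop()
    }
}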
B. Usage in the Main View
The MusicDelegate is used in the HarmonyWithMe view to handle background music looping and game state resetting.
@State private var musicDelegate: MusicDelegate?
private func startBackgroundMusic() {
    backgroundPlayer = AudioManager.createAudioPlayer(filename: "testBGM", fileExtension: "m4a")
    guard let player = backgroundPlayer else {
        print("❌ Failed to create background player")
        return
    }

    // Fade the track in rather than starting at full volume.
    AudioManager.fadeIn(player)

    // Reset the game and restart the music-position timer whenever the track loops.
    musicDelegate = MusicDelegate {
        DispatchQueue.main.async {
            self.resetGame()
            self.startMusicTimer()
        }
    }
    player.delegate = musicDelegate
    player.numberOfLoops = -1   // -1 loops indefinitely at the player level
}
- Creating the Audio Player
  - The startBackgroundMusic() method creates an AVAudioPlayer instance for the background music.
- Setting the Delegate
  - A MusicDelegate instance is created with a closure that resets the game and restarts the music timer.
  - The musicDelegate property is assigned this instance.
  - The AVAudioPlayer instance's delegate property is set to the musicDelegate.
- Handling Music Looping
  - When the background music finishes playing, the audioPlayerDidFinishPlaying(_:successfully:) method of the MusicDelegate is called.
  - The onMusicLoop closure is executed, which resets the game state and restarts the music timer.
The MusicDelegate class is a crucial component in the HarmonyWithMe game, ensuring that the background music loops seamlessly and the game state is reset appropriately. By conforming to the AVAudioPlayerDelegate protocol and using a closure to define the actions on music completion, it provides a flexible and efficient way to manage background music playback and game state transitions.
Development - Feel Your Vibe
I. Game Philosophy
"Feel Your Vibe" is an interactive music game that allows users to dynamically control tremolo effects using intuitive swipe gestures. Vertical swipes adjust the tremolo frequency (speed), while horizontal movements control the tremolo depth (amplitude). The game features a custom Low-Frequency Oscillator (LFO) and frequency-adaptive haptic feedback for a rich sensory experience.
II. Architecture and Components
The game is built using a modular design, leveraging shared components and game-specific logic:
- Audio System:
  - AudioManager: Manages audio session configuration, playback, and volume control.
  - LFO: A custom class responsible for generating the tremolo effect.
- User Interaction:
  - DragGesture: Detects and processes swipe gestures.
  - Screen center-crossing detection: Ensures accurate timing for frequency changes based on vertical swipes.
- Haptic Feedback:
  - HapticManager provides tactile feedback synchronized with the tremolo effect.
- State Management: Manages playback state, LFO parameters (frequency and amplitude), and UI visibility.
III. Core Functionality
A. LFO (Low-Frequency Oscillator)
The LFO generates a sine wave to modulate the audio volume, creating the tremolo effect.
- Class Structure:
import Foundation  // for sin()

class LFO {
    var frequency: Double = 1.0   // Oscillation speed (Hz)
    var amplitude: Double = 0.1   // Oscillation depth (0-1)
    private var phase: Double = 0.0

    // Advance the oscillator by deltaTime and return the current modulation value.
    func update(deltaTime: TimeInterval) -> Double {
        phase += 2 * .pi * frequency * deltaTime
        return amplitude * sin(phase)
    }
}
- Key Functionality:
- Wave Generation: Uses a sine wave function for smooth oscillation.
- Parameter Control:
- Frequency: Ranges from 0.1 Hz to 10.0 Hz.
- Amplitude: Ranges from 0.0 (no effect) to 1.0 (maximum modulation).
- Time Management: Uses deltaTime for precise phase calculation, ensuring consistent tremolo regardless of frame rate.
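As a sketch of how the LFO output might drive the player volume, the hypothetical driver below uses a CADisplayLink as the update loop; the class name, the 0.5 base volume, and the use of AVAudioPlayer.volume for modulation are assumptions.

import AVFoundation
import QuartzCore

// Hypothetical update loop that applies the LFO output to an audio player's volume.
final class TremoloDriver: NSObject {
    let lfo = LFO()
    private let player: AVAudioPlayer
    private var displayLink: CADisplayLink?
    private var lastTimestamp: CFTimeInterval = 0

    init(player: AVAudioPlayer) {
        self.player = player
        super.init()
    }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step(_:)))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func step(_ link: CADisplayLink) {
        // deltaTime is the real elapsed time since the last frame, so the
        // tremolo stays consistent regardless of frame rate.
        let dt = lastTimestamp == 0 ? 0 : link.timestamp - lastTimestamp
        lastTimestamp = link.timestamp
        // Center the modulation around a base volume and clamp to 0...1.
        let modulated = 0.5 + lfo.update(deltaTime: dt)
        player.volume = Float(min(max(modulated, 0), 1))
    }
}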
B. Gesture Handling and Parameter Mapping
Swipe gestures are processed to control the LFO parameters:
- Vertical Swipes (Frequency Control):
  - Screen Center Crossing Detection: Tracks when the swipe crosses the vertical midpoint of the screen.
  - Frequency Calculation: Calculates the time between center crossings to determine the swipe speed and map it to the LFO frequency. A moving average is used for smoothing.
  - Frequency Range: Constrained between 0.1 Hz and 10 Hz.
- Horizontal Movements (Amplitude Control):
  - Position Mapping: Maps the horizontal touch position to the LFO amplitude. The left side corresponds to minimum amplitude, the right side to maximum.
  - Amplitude Scaling: amplitude = (position.x / screenWidth) * (maxAmp - minAmp) + minAmp
- LFO Update: The calculated frequency and amplitude values are applied to the LFO instance in each update cycle (see the sketch below).
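The sketch below shows one way the mapping could be wired into a DragGesture handler. The view name, the stored properties (lastCrossing, lastY), and the 0.5 / interval frequency estimate are assumptions, and the moving-average smoothing mentioned above is omitted for brevity.

import SwiftUI

// Hypothetical gesture-to-parameter mapping.
struct VibeGestureView: View {
    @State private var lfo = LFO()
    @State private var lastCrossing: Date?
    @State private var lastY: CGFloat = 0

    var body: some View {
        GeometryReader { geo in
            Color.black
                .gesture(
                    DragGesture(minimumDistance: 0).onChanged { value in
                        // Horizontal position -> amplitude (left = minimum, right = maximum).
                        let minAmp = 0.0, maxAmp = 1.0
                        let x = Double(min(max(value.location.x, 0), geo.size.width))
                        lfo.amplitude = (x / Double(geo.size.width)) * (maxAmp - minAmp) + minAmp

                        // Vertical midpoint crossing -> frequency.
                        let midY = geo.size.height / 2
                        if (lastY < midY) != (value.location.y < midY) {
                            if let last = lastCrossing {
                                // Two crossings per oscillation, so frequency ~ 0.5 / interval,
                                // clamped to the 0.1-10 Hz range.
                                let freq = 0.5 / Date().timeIntervalSince(last)
                                lfo.frequency = min(max(freq, 0.1), 10.0)
                            }
                            lastCrossing = Date()
                        }
                        lastY = value.location.y
                    }
                )
        }
    }
}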
C. Haptic Feedback
The haptic feedback is frequency-adaptive:
- Low Frequency (≤ 2 Hz): Continuous haptic patterns with longer duration and higher intensity.
- High Frequency (> 2 Hz): Short, sharp pulses with lower intensity.
- Gesture Response: Haptic feedback is triggered on screen center crossing, with intensity varying based on swipe speed.
IV. Game Flow and Lifecycle
A. Game Flow
The main game loop is as follows:
- Initialization: Audio, LFO, and haptic systems are initialized.
- User Input (Swipe): Gestures are detected and processed.
- Parameter Update: LFO frequency and amplitude are updated based on swipe data.
- Audio Modulation: The LFO output modulates the audio volume.
- Haptic Feedback: Haptic patterns are triggered based on the LFO frequency.
- Repeat Steps 2-5.
B. Lifecycle Methods
- onAppear:
  - Configures the audio session.
  - Prepares the haptic engine.
  - Initializes the background music player and LFO.
  - Starts inactivity monitoring.
- Gesture Handling: Continuously monitors DragGesture changes, updates LFO parameters, and resets the inactivity timer.
- onDisappear:
  - Fades out the audio.
  - Stops and cleans up the LFO.
  - Invalidates timers.
  - Releases haptic resources.
  - Cleans up the audio session.
V. Key Features
- Dynamic Tremolo Control: Intuitive gesture-based control of tremolo frequency and depth.
- Custom LFO: Efficient sine wave generation for smooth tremolo effects.
- Frequency-Adaptive Haptics: Enhanced tactile feedback synchronized with audio changes.
- Minimalist UI: Focuses on the core interactive experience.
VI. State Variables (Examples)
- lfo: The LFO instance.
- currentFrequency: The current LFO frequency.
- currentAmplitude: The current LFO amplitude.
- lastTouchLocation: The last recorded touch position.
- lastTouchTime: The timestamp of the last touch event.
Development - Touch for Music
I. Game Philosophy
"Touch for Music" is an interactive music game where users control music volume by touching and holding anywhere on the screen. A dynamic pulsing circle provides visual feedback synchronized with the audio changes. The game focuses on intuitive touch interaction and a minimalist aesthetic.
II. Architecture and Components
- Audio System:
  - AudioManager: Manages audio session configuration, playback, and smooth volume transitions.
- Visual Feedback:
  - Custom circle animation system using SwiftUI animations.
- User Interaction:
  - DragGesture: Detects and tracks touch location and state (began, changed, ended).
- State Management: Manages touch state, circle visibility, and inactivity timeout.
III. Core Functions
A. Circle Animation
The circle animation provides visual feedback synchronized with the audio volume.
- Implementation: Uses SwiftUI's @State properties and animations:
struct TouchForMusic: View {
    @State private var circleScale: CGFloat = 1.0
    @State private var circleOpacity: Double = 0.0
    @State private var touchLocation: CGPoint?
    @State private var isTouching: Bool = false // Tracks touch state

    var body: some View {
        // Example animation applied to the circle:
        Circle()
            .scaleEffect(isTouching ? 1.2 : 1.0)                       // Scale up when touching
            .opacity(isTouching ? 1.0 : 0.5)                           // Change opacity based on touch state
            .animation(.easeInOut(duration: 0.2), value: isTouching)   // Smooth animation
        // ...
    }
}
- Key Functionality:
  - Touch-Responsive Scaling and Opacity: The circle scales up slightly and becomes more opaque when the user touches the screen, providing immediate feedback.
  - Smooth Transitions: easeInOut animations are used for smooth transitions between states.
  - Position Tracking: The circle's position is updated to follow the touch location using the touchLocation state.
B. Touch Handling and Volume Control
The DragGesture is used to handle touch input and control the volume.
- Gesture Handling:
  - The onChanged handler of the DragGesture updates the touchLocation state and calculates the volume based on the vertical touch position.
  - The onEnded handler resets the touchLocation and performs any necessary cleanup.
- Volume Mapping:
  - The vertical touch position is mapped to a volume range (e.g., 0.0 to 1.0). Touching higher on the screen increases the volume, while touching lower decreases it.
  - A scaling function is used to ensure a smooth and proportional volume change.
// Example volume calculation:
let screenHeight = UIScreen.main.bounds.height
let touchY = touchLocation?.y ?? screenHeight / 2 // Default to center if no touch
let volume = 1.0 - (touchY / screenHeight) // Invert so higher touch = higher volume
audioManager.setVolume(Float(volume)) // Assuming AudioManager has a setVolume function
C. Inactivity Monitoring
A timer is used to detect user inactivity.
- Implementation: A Timer is scheduled to fire periodically (e.g., every 1 second).
- Timeout: If no touch events are detected within a predefined timeout period (e.g., 10 seconds), the home button is displayed.
- Reset: Any touch event resets the inactivity timer.
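A minimal sketch of this pattern in SwiftUI is shown below; the view name, the overlay placement, and the exact interval and timeout constants are illustrative (the timeout follows the example above).

import SwiftUI
import Combine

// Hypothetical inactivity monitor: any touch resets the clock, and the home
// button appears once no touches have arrived for the timeout period.
struct InactivityExample: View {
    @State private var showHomeButton = false
    @State private var lastInteraction = Date()
    private let checkInterval: TimeInterval = 1    // poll once per second
    private let timeout: TimeInterval = 10         // show the home button after 10 s

    var body: some View {
        Color.black
            .gesture(
                DragGesture(minimumDistance: 0).onChanged { _ in
                    lastInteraction = Date()
                    showHomeButton = false
                }
            )
            .overlay(alignment: .topLeading) {
                if showHomeButton {
                    Button("Home") { /* navigate back to the main menu */ }
                }
            }
            .onReceive(Timer.publish(every: checkInterval, on: .main, in: .common).autoconnect()) { now in
                if now.timeIntervalSince(lastInteraction) > timeout {
                    showHomeButton = true
                }
            }
    }
}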
IV. UI Elements
- Pulsing Circle: The primary visual element, providing feedback on touch interaction and volume changes.
- Home Button: Appears after inactivity, allowing users to navigate back to the main menu.
- Initial Instructions: Brief instructions on how to use the game.
V. Lifecycle Methods
- onAppear:
  - Configures the audio session.
  - Starts background music playback.
  - Starts inactivity monitoring.
- onChanged (DragGesture):
  - Updates touchLocation.
  - Calculates and sets the audio volume.
  - Updates circle animation.
  - Resets inactivity timer.
- onEnded (DragGesture):
  - Resets touchLocation and any relevant states.
- onDisappear:
  - Fades out audio.
  - Invalidates timers.
  - Cleans up resources.
VI. State Variables (Examples)
- circleScale: The current scale of the circle.
- circleOpacity: The current opacity of the circle.
- touchLocation: The current touch location.
- isTouching: A boolean indicating whether the user is currently touching the screen.
- inactivityTimer: The timer used for inactivity monitoring.


