Tranquil Heart Development Documentation

App Development Technical Docs

Haptic Module

Touch Strategies Across Mini-Games

Harmony With Me – Dense Pulse Feedback
User taps are paired with dense, higher-amplitude haptic pulses, giving a sense of weight and reinforcement to each rhythmic interaction.
This aligns the felt sensation with the sonic and visual beat events, strengthening the perception of stability and presence.

Feel Your Vibe – Frequency-Adaptive Vibration
Haptic frequency mirrors audio tremolo frequency:
continuous pulses at low LFO rates; sharper and more transient patterns at higher rates.
This creates a tactile rhythm texture that reflects the LFO’s temporal structure, allowing users to “feel” rhythm variance.

Touch for Music – Universal Tactile Presence
Every user interaction triggers tactile feedback. Although technically simple, this is a deliberate design choice:
The constant tactile response works as a somatic anchor, linking every gesture to a physical sensation and thereby increasing the sense of agency and rhythm embodiment.

HapticManager Enum

The HapticManager enum is a utility for managing haptic feedback on devices that support Core Haptics. It provides various methods to trigger different types of haptic feedback patterns. Below is a detailed explanation of each haptic mode and the corresponding algorithm used to create the vibration patterns.

enum HapticManager {
    static nonisolated(unsafe) private var engine: CHHapticEngine?

    static func prepareHaptics()
    static private func restartEngine()
    static func triggerNormalHaptic()
    static func triggerMultiPattern(frequency: Double, amplitude: Double) async
    static func triggerGradualHaptic(intensity: Float = 0.6)
    static func triggerHeavyHaptic()
}

Methods

prepareHaptics()
This method initializes the haptic engine if the device supports haptics. It starts the engine and prints a message indicating the engine has started.
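A minimal sketch of how this initialization could be implemented with Core Haptics, reusing the engine property from the declaration above (the log messages and error handling are assumptions):

import CoreHaptics
import Foundation

enum HapticManager {
    static nonisolated(unsafe) private var engine: CHHapticEngine?

    static func prepareHaptics() {
        // Do nothing on hardware without a Taptic Engine.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        do {
            engine = try CHHapticEngine()
            try engine?.start()
            print("✅ Haptic engine started")
        } catch {
            print("❌ Failed to start haptic engine: \(error.localizedDescription)")
        }
    }
}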

restartEngine()
This private method attempts to restart the haptic engine if it fails during operation.

triggerNormalHaptic()
This method triggers a single, short haptic tap. It creates a transient haptic event with moderate intensity and low sharpness.

Algorithm:

  1. Check if the haptic engine is available.
  2. Create haptic event parameters for intensity (0.4) and sharpness (0.2).
  3. Create a transient haptic event with the parameters.
  4. Create a haptic pattern with the event.
  5. Create a player for the pattern and start it.
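A sketch of how these steps could translate into Core Haptics calls, continuing inside the HapticManager enum sketched above; falling back to restartEngine() on failure is an assumption based on its description above:

static func triggerNormalHaptic() {
    guard let engine else { return }            // Step 1: engine must be available
    do {
        // Steps 2-3: a single transient "tap" with moderate intensity and low sharpness
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4)
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        let event = CHHapticEvent(eventType: .hapticTransient,
                                  parameters: [intensity, sharpness],
                                  relativeTime: 0)
        // Steps 4-5: wrap the event in a pattern and play it immediately
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    } catch {
        restartEngine()                         // attempt recovery via restartEngine() above
    }
}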

triggerMultiPattern(frequency: Double, amplitude: Double) async
This method triggers a haptic pattern based on the provided frequency and amplitude. It differentiates between low and high frequencies to create continuous or transient patterns.

Note: This pattern is designed specifically for the Feel Your Vibe game.

Algorithm:

  1. Check if the haptic engine is available.
  2. For low frequencies (<= 2.0):
    • Create continuous haptic event parameters for intensity (amplitude * 0.7) and sharpness (0.3).
    • Create a continuous haptic event with a duration of 0.2 seconds.
  3. For high frequencies (> 2.0):
    • Create transient haptic event parameters for intensity (amplitude * 0.6) and sharpness (0.5).
    • Create a transient haptic event.
  4. Create a haptic pattern with the event.
  5. Create a player for the pattern and start it.
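Continuing the sketch, the frequency branch might look like this; the thresholds and scaling factors follow the algorithm above, while the error handling is an assumption:

static func triggerMultiPattern(frequency: Double, amplitude: Double) async {
    guard let engine else { return }
    do {
        let event: CHHapticEvent
        if frequency <= 2.0 {
            // Low LFO rates: a short continuous buzz that feels like a rolling pulse
            let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                                   value: Float(amplitude * 0.7))
            let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
            event = CHHapticEvent(eventType: .hapticContinuous,
                                  parameters: [intensity, sharpness],
                                  relativeTime: 0, duration: 0.2)
        } else {
            // High LFO rates: a sharper transient tick
            let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                                   value: Float(amplitude * 0.6))
            let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            event = CHHapticEvent(eventType: .hapticTransient,
                                  parameters: [intensity, sharpness],
                                  relativeTime: 0)
        }
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    } catch {
        restartEngine()
    }
}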

triggerGradualHaptic(intensity: Float = 0.6)
This method triggers a haptic pattern that gradually increases and then decreases in intensity over a duration of 2 seconds.

Algorithm:

  1. Check if the device supports haptics and if the haptic engine is available.
  2. Create a series of continuous haptic events with varying intensity to form a bell curve.
  3. Each event has a duration of 0.1 seconds and the intensity peaks in the middle.
  4. Create a haptic pattern with the events.
  5. Create a player for the pattern, start the engine, and start the player.
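One way to build the bell-shaped envelope is to map each 0.1-second segment onto a sine half-wave, as in the sketch below; the segment count and the sine-based envelope are assumptions chosen to match the 2-second duration described above:

static func triggerGradualHaptic(intensity: Float = 0.6) {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics,
          let engine else { return }
    do {
        // 20 segments of 0.1 s each cover the 2-second duration;
        // a sine half-wave rises to the peak intensity in the middle and falls back.
        let segmentCount = 20
        let events: [CHHapticEvent] = (0..<segmentCount).map { i in
            let progress = Double(i) / Double(segmentCount - 1)      // 0 ... 1
            let envelope = Float(sin(progress * .pi))                // 0 -> 1 -> 0
            let level = CHHapticEventParameter(parameterID: .hapticIntensity,
                                               value: intensity * envelope)
            return CHHapticEvent(eventType: .hapticContinuous,
                                 parameters: [level],
                                 relativeTime: Double(i) * 0.1,
                                 duration: 0.1)
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try? engine.start()                                          // restart the engine if it was stopped
        try player.start(atTime: CHHapticTimeImmediate)
    } catch {
        restartEngine()
    }
}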

triggerHeavyHaptic()
This method triggers a haptic pattern that starts with strong intensity and gradually decreases.

Algorithm:

  1. Check if the haptic engine is available.
  2. Create a series of continuous haptic events with decreasing intensity.
  3. Each event has a duration of 0.2 seconds.
  4. Create a haptic pattern with the events.
  5. Create a player for the pattern and start it.

Development - Harmony With Me

I. Game Philosophy

The game is an interactive music experience where users create harmonious sounds by tapping a circle that appears at random positions on the screen. The game features background music, haptic feedback, and inactivity monitoring to enhance the user experience.

II. Key Components

  1. File Management

    • Uses FileSystemManager for audio resource handling
    • Checks for essential files like "testBGM.m4a"
  2. Audio System

    • Managed through AudioManager
    • Handles background music and harmony sounds
  3. User Interaction

    • Uses DragGesture for touch detection
    • Implements circle-based touch area detection
    • Includes haptic feedback via HapticManager
  4. State Management

    • Tracks visibility state (showHomeButton)
    • Monitors gameplay state
    • Handles cleanup on view disappear

III. Main View Structure

  1. Imports and Declarations

    • Import necessary frameworks: SwiftUI, AVFoundation, CoreHaptics
    • Define MusicDelegate class for handling background music looping
  2. Main View: HarmonyWithMe

    • State properties for managing game state, audio players, timers, and UI elements
    • Constants for game configuration (e.g., circle duration, sound groups)
  3. body View

    • NavigationStack with GeometryReader for responsive layout
    • ZStack for layering UI elements:
      • Background color
      • Instruction text
      • Interactive circle
      • Home button
  4. Lifecycle Handlers

    • .onAppear to initialize game components and start background music
    • .gesture to handle touch interactions
    • .onDisappear to clean up resources
  5. Helper Methods

    • startBackgroundMusic(): Initialize and start background music with looping
    • initializeGame(): Set up initial game state and start timers
    • playRandomSoundFromCurrentGroup(): Play a random sound from the current sound group
    • isPlayingHarmonySound(): Check if a harmony sound is currently playing
    • isTouchInsideCircle(at:): Determine if a touch is inside the interactive circle (see the sketch after this list)
    • showCircleAtRandomPosition(): Display the interactive circle at a random position
    • startMusicTimer(): Monitor background music playback position
    • checkMusicPosition(): Check if it's time to show the interactive circle
    • startInactivityTimer(): Start a timer to monitor user inactivity
    • stopInactivityTimer(): Stop the inactivity timer
    • resetInactivityTimer(): Reset the inactivity timer and hide the home button
    • checkForInactivity(): Check if the user has been inactive for a certain period
    • resetGame(): Reset the game state
    • fadeOutAndClean(): Fade out audio and clean up resources
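As an illustration of the circle hit test referenced in the list above, here is a minimal sketch of isTouchInsideCircle(at:); the circlePosition and circleRadius properties are assumed names for the view state holding the circle's current position and size:

// Assumed view state, for illustration only:
// @State private var circlePosition: CGPoint = .zero
// @State private var circleRadius: CGFloat = 60

private func isTouchInsideCircle(at location: CGPoint) -> Bool {
    // A touch is a hit when its distance from the circle's center
    // does not exceed the circle's radius.
    let dx = location.x - circlePosition.x
    let dy = location.y - circlePosition.y
    return (dx * dx + dy * dy) <= circleRadius * circleRadius
}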

IV. The Self-Defined MusicDelegate Class

The MusicDelegate class is a custom delegate for handling background music playback events in the HarmonyWithMe game. It conforms to the AVAudioPlayerDelegate protocol and is primarily responsible for detecting when the background music has finished playing and triggering appropriate actions to reset the game state and loop the music.

A. How It Works

  1. Initialization

    • The MusicDelegate is initialized with a closure (onMusicLoop) that defines the actions to be taken when the background music finishes playing.
    • This closure is stored in the onMusicLoop property.
  2. Conforming to AVAudioPlayerDelegate

    • The MusicDelegate class conforms to the AVAudioPlayerDelegate protocol, which requires the implementation of the audioPlayerDidFinishPlaying(_:successfully:) method.
  3. Handling Music Playback Completion

    • The audioPlayerDidFinishPlaying(_:successfully:) method is called automatically by the AVAudioPlayer instance when the music finishes playing.
    • The method checks if the playback finished successfully using the flag parameter.
    • If the playback finished successfully, the method prints a debug message and calls the onMusicLoop closure to reset the game and restart the music.
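Based on the description above, a minimal sketch of the class could look like this (the debug message is an assumption):

import AVFoundation

class MusicDelegate: NSObject, AVAudioPlayerDelegate {
    // Closure supplied by the view; runs when the background music finishes playing.
    let onMusicLoop: () -> Void

    init(onMusicLoop: @escaping () -> Void) {
        self.onMusicLoop = onMusicLoop
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        guard flag else { return }                 // ignore playback that did not finish successfully
        print("🔁 Background music finished, looping")
        onMusicLoop()                              // reset the game and restart the music
    }
}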

B. Usage in the Main View

The MusicDelegate is used in the HarmonyWithMe view to handle background music looping and game state resetting.

@State private var musicDelegate: MusicDelegate?

private func startBackgroundMusic() {
    backgroundPlayer = AudioManager.createAudioPlayer(filename: "testBGM", fileExtension: "m4a")

    guard let player = backgroundPlayer else {
        print("❌ Failed to create background player")
        return
    }
    AudioManager.fadeIn(player)

    musicDelegate = MusicDelegate {
        DispatchQueue.main.async {
            self.resetGame()
            self.startMusicTimer()
        }
    }
    player.delegate = musicDelegate
    // -1 keeps AVAudioPlayer looping on its own; note that audioPlayerDidFinishPlaying(_:successfully:)
    // fires only when playback reaches the end of its final loop.
    player.numberOfLoops = -1
}
  1. Creating the Audio Player

    • The startBackgroundMusic() method creates an AVAudioPlayer instance for the background music.
  2. Setting the Delegate

    • A MusicDelegate instance is created with a closure that resets the game and restarts the music timer.
    • The musicDelegate property is assigned this instance.
    • The AVAudioPlayer instance's delegate property is set to the musicDelegate.
  3. Handling Music Looping

    • When the background music finishes playing, the audioPlayerDidFinishPlaying(_:successfully:) method of the MusicDelegate is called.
    • The onMusicLoop closure is executed, which resets the game state and restarts the music timer.

The MusicDelegate class is a crucial component in the HarmonyWithMe game, ensuring that the background music loops seamlessly and the game state is reset appropriately. By conforming to the AVAudioPlayerDelegate protocol and using a closure to define the actions on music completion, it provides a flexible and efficient way to manage background music playback and game state transitions.


Development - Feel Your Vibe

I. Game Philosophy

"Feel Your Vibe" is an interactive music game that allows users to dynamically control tremolo effects using intuitive swipe gestures. Vertical swipes adjust the tremolo frequency (speed), while horizontal movements control the tremolo depth (amplitude). The game features a custom Low-Frequency Oscillator (LFO) and frequency-adaptive haptic feedback for a rich sensory experience.

II. Architecture and Components

The game is built using a modular design, leveraging shared components and game-specific logic:

  • Audio System:
    • AudioManager: Manages audio session configuration, playback, and volume control.
    • LFO: A custom class responsible for generating the tremolo effect.
  • User Interaction:
    • DragGesture: Detects and processes swipe gestures.
    • Screen center-crossing detection: Ensures accurate timing for frequency changes based on vertical swipes.
  • Haptic Feedback: HapticManager provides tactile feedback synchronized with the tremolo effect.
  • State Management: Manages playback state, LFO parameters (frequency and amplitude), and UI visibility.

III. Core Functionality

A. LFO (Low-Frequency Oscillator)

The LFO generates a sine wave to modulate the audio volume, creating the tremolo effect.

  • Class Structure:
class LFO {
    var frequency: Double = 1.0 // Oscillation speed (Hz)
    var amplitude: Double = 0.1 // Oscillation depth (0-1)
    private var phase: Double = 0.0

    func update(deltaTime: TimeInterval) -> Double {
        phase += 2 * .pi * frequency * deltaTime
        // Keep the phase bounded so precision does not degrade over long sessions
        phase = phase.truncatingRemainder(dividingBy: 2 * .pi)
        return amplitude * sin(phase)
    }
}
  • Key Functionality:
    • Wave Generation: Uses a sine wave function for smooth oscillation.
    • Parameter Control:
      • Frequency: Ranges from 0.1 Hz to 10.0 Hz.
      • Amplitude: Ranges from 0.0 (no effect) to 1.0 (maximum modulation).
    • Time Management: Uses deltaTime for precise phase calculation, ensuring consistent tremolo regardless of frame rate.
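To make the modulation concrete, here is a sketch of how the LFO output might drive the player's volume on each update tick; startTremolo(on:with:baseVolume:) is a hypothetical helper, and the 60 Hz rate and base volume of 0.7 are illustrative assumptions:

import AVFoundation

/// Applies the LFO output to the player's volume roughly 60 times per second.
/// Returns the timer so the caller can invalidate it on teardown.
func startTremolo(on player: AVAudioPlayer, with lfo: LFO, baseVolume: Double = 0.7) -> Timer {
    var lastUpdate = Date()
    return Timer.scheduledTimer(withTimeInterval: 1.0 / 60.0, repeats: true) { _ in
        let now = Date()
        let deltaTime = now.timeIntervalSince(lastUpdate)
        lastUpdate = now

        // Center the oscillation around the base volume and clamp to AVAudioPlayer's 0...1 range.
        let modulated = baseVolume + lfo.update(deltaTime: deltaTime)
        player.volume = Float(min(max(modulated, 0.0), 1.0))
    }
}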

B. Gesture Handling and Parameter Mapping

Swipe gestures are processed to control the LFO parameters:

  1. Vertical Swipes (Frequency Control):

    • Screen Center Crossing Detection: Tracks when the swipe crosses the vertical midpoint of the screen.
    • Frequency Calculation: Calculates the time between center crossings to determine the swipe speed and map it to the LFO frequency. A moving average is used for smoothing.
    • Frequency Range: Constrained between 0.1 Hz and 10 Hz.
  2. Horizontal Movements (Amplitude Control):

    • Position Mapping: Maps the horizontal touch position to the LFO amplitude. Left side corresponds to minimum amplitude, right side to maximum.
    • Amplitude Scaling: amplitude = (position.x / screenWidth) * (maxAmp - minAmp) + minAmp
  3. LFO Update: The calculated frequency and amplitude values are applied to the LFO instance in each update cycle.
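A sketch of the two mappings described in steps 1 and 2; the moving-average window, the crossing-interval-to-frequency relation, and the amplitude bounds are illustrative assumptions:

import CoreGraphics
import Foundation

/// Maps the intervals between successive screen-center crossings to an LFO frequency,
/// smoothed with a short moving average and clamped to the 0.1-10 Hz range.
func frequency(fromCrossingIntervals intervals: [TimeInterval]) -> Double {
    guard !intervals.isEmpty else { return 1.0 }
    let window = intervals.suffix(4)                      // moving average over the most recent crossings
    let averageInterval = window.reduce(0, +) / Double(window.count)
    let rawFrequency = 1.0 / max(averageInterval, 0.001)  // faster swipes -> shorter intervals -> higher frequency
    return min(max(rawFrequency, 0.1), 10.0)
}

/// Maps the horizontal touch position to an amplitude between minAmp and maxAmp.
func amplitude(forTouchX x: CGFloat, screenWidth: CGFloat,
               minAmp: Double = 0.0, maxAmp: Double = 1.0) -> Double {
    let normalized = Double(min(max(x / screenWidth, 0), 1))   // clamp to the screen bounds
    return normalized * (maxAmp - minAmp) + minAmp
}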

C. Haptic Feedback

The haptic feedback is frequency-adaptive:

  • Low Frequency (≤ 2 Hz): Continuous haptic patterns with longer duration and higher intensity.
  • High Frequency (> 2 Hz): Short, sharp pulses with lower intensity.
  • Gesture Response: Haptic feedback is triggered on screen center crossing, with intensity varying based on swipe speed.

IV. Game Flow and Lifecycle

A. Game Flow

The main game loop is as follows:

  1. Initialization: Audio, LFO, and haptic systems are initialized.
  2. User Input (Swipe): Gestures are detected and processed.
  3. Parameter Update: LFO frequency and amplitude are updated based on swipe data.
  4. Audio Modulation: The LFO output modulates the audio volume.
  5. Haptic Feedback: Haptic patterns are triggered based on the LFO frequency.
  6. Repeat Steps 2-5.

B. Lifecycle Methods

  • onAppear:
    • Configures the audio session.
    • Prepares the haptic engine.
    • Initializes the background music player and LFO.
    • Starts inactivity monitoring.
  • Gesture Handling: Continuously monitors DragGesture changes, updates LFO parameters, and resets the inactivity timer.
  • onDisappear:
    • Fades out the audio.
    • Stops and cleans up the LFO.
    • Invalidates timers.
    • Releases haptic resources.
    • Cleans up the audio session.

V. Key Features

  • Dynamic Tremolo Control: Intuitive gesture-based control of tremolo frequency and depth.
  • Custom LFO: Efficient sine wave generation for smooth tremolo effects.
  • Frequency-Adaptive Haptics: Enhanced tactile feedback synchronized with audio changes.
  • Minimalist UI: Focuses on the core interactive experience.

VI. State Variables (Examples)

  • lfo: The LFO instance.
  • currentFrequency: The current LFO frequency.
  • currentAmplitude: The current LFO amplitude.
  • lastTouchLocation: The last recorded touch position.
  • lastTouchTime: The timestamp of the last touch event.

Development - Touch for Music

I. Game Philosophy

"Touch for Music" is an interactive music game where users control music volume by touching and holding anywhere on the screen. A dynamic pulsing circle provides visual feedback synchronized with the audio changes. The game focuses on intuitive touch interaction and a minimalist aesthetic.

II. Architecture and Components

  • Audio System:
    • AudioManager: Manages audio session configuration, playback, and smooth volume transitions.
  • Visual Feedback:
    • Custom circle animation system using SwiftUI animations.
  • User Interaction:
    • DragGesture: Detects and tracks touch location and state (began, changed, ended).
  • State Management: Manages touch state, circle visibility, and inactivity timeout.

III. Core Functions

A. Circle Animation

The circle animation provides visual feedback synchronized with the audio volume.

  • Implementation: Uses SwiftUI's @State properties and animations:
import SwiftUI

struct TouchForMusic: View {
    @State private var circleScale: CGFloat = 1.0
    @State private var circleOpacity: Double = 0.0
    @State private var touchLocation: CGPoint?
    @State private var isTouching: Bool = false // Tracks touch state

    var body: some View {
        // Example animation:
        Circle()
            .scaleEffect(isTouching ? 1.2 : 1.0)                      // Scale up when touching
            .opacity(isTouching ? 1.0 : 0.5)                          // Change opacity based on touch state
            .animation(.easeInOut(duration: 0.2), value: isTouching)  // Smooth animation
        // ...
    }
}
  • Key Functionality:
    • Touch-Responsive Scaling and Opacity: The circle scales up slightly and becomes more opaque when the user touches the screen, providing immediate feedback.
    • Smooth Transitions: easeInOut animations are used for smooth transitions between states.
    • Position Tracking: The circle's position is updated to follow the touch location using the touchLocation state.

B. Touch Handling and Volume Control

The DragGesture is used to handle touch input and control the volume.

  • Gesture Handling:

    • The onChanged handler of the DragGesture updates the touchLocation state and calculates the volume based on the vertical touch position.
    • The onEnded handler resets the touchLocation and performs any necessary cleanup.
  • Volume Mapping:

    • The vertical touch position is mapped to a volume range (e.g., 0.0 to 1.0). Touching higher on the screen increases the volume, while touching lower decreases it.
    • A scaling function is used to ensure a smooth and proportional volume change.
// Example volume calculation:
let screenHeight = UIScreen.main.bounds.height
let touchY = touchLocation?.y ?? screenHeight / 2 // Default to center if no touch

let volume = 1.0 - (touchY / screenHeight) // Invert so higher touch = higher volume
audioManager.setVolume(Float(volume)) // Assuming AudioManager has a setVolume function

C. Inactivity Monitoring

A timer is used to detect user inactivity.

  • Implementation: A Timer is scheduled to fire periodically (e.g., every 1 second).
  • Timeout: If no touch events are detected within a predefined timeout period (e.g., 10 seconds), the home button is displayed.
  • Reset: Any touch event resets the inactivity timer.
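A self-contained sketch of this pattern in SwiftUI; the view name, property names, and the exact gesture wiring are assumptions, while the 1-second tick and 10-second timeout follow the values above:

import SwiftUI

// Illustrative stand-alone view; the names mirror the state variables listed below.
struct InactivityExample: View {
    @State private var showHomeButton = false
    @State private var lastInteraction = Date()
    @State private var inactivityTimer: Timer?

    var body: some View {
        Color.black.ignoresSafeArea()
            .gesture(
                DragGesture(minimumDistance: 0).onChanged { _ in
                    lastInteraction = Date()        // Any touch resets the countdown
                    showHomeButton = false
                }
            )
            .overlay(alignment: .topLeading) {
                if showHomeButton {
                    Button("Home") { /* navigate back to the menu */ }
                }
            }
            .onAppear {
                // Fire once per second and compare against the 10-second timeout.
                inactivityTimer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { _ in
                    if Date().timeIntervalSince(lastInteraction) > 10 {
                        showHomeButton = true
                    }
                }
            }
            .onDisappear { inactivityTimer?.invalidate() }
    }
}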

IV. UI Elements

  • Pulsing Circle: The primary visual element, providing feedback on touch interaction and volume changes.
  • Home Button: Appears after inactivity, allowing users to navigate back to the main menu.
  • Initial Instructions: Brief instructions on how to use the game.

V. Lifecycle Methods

  • onAppear:
    • Configures the audio session.
    • Starts background music playback.
    • Starts inactivity monitoring.
  • onChanged (DragGesture):
    • Updates touchLocation.
    • Calculates and sets the audio volume.
    • Updates circle animation.
    • Resets inactivity timer.
  • onEnded (DragGesture):
    • Resets touchLocation and any relevant states.
  • onDisappear:
    • Fades out audio.
    • Invalidates timers.
    • Cleans up resources.

VI. State Variables (Examples)

  • circleScale: The current scale of the circle.
  • circleOpacity: The current opacity of the circle.
  • touchLocation: The current touch location.
  • isTouching: A boolean indicating whether the user is currently touching the screen.
  • inactivityTimer: The timer used for inactivity monitoring.