Compare commits

...

2 commits

Author SHA1 Message Date
c3e4dc0e79 Add CLAUDE.md with architecture, patterns, and recent fixes
Documents the app's purpose, architecture, rendering pipeline,
build instructions, and key design decisions (bitmap lifecycle,
thread safety, error handling) established during the recent audit.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 13:46:50 +01:00
11a79076bc Fix concurrency, lifecycle, performance, and config issues from audit
Concurrency & bitmap lifecycle:
- Defer bitmap recycling by one cycle so Compose finishes drawing before
  native memory is freed (preview bitmaps, thumbnails)
- Make galleryPreviewSource @Volatile for cross-thread visibility
- Join preview job before recycling source bitmap in cancelGalleryPreview()
  to prevent use-after-free during CPU blur loop
- Add @Volatile to TiltShiftRenderer.currentTexCoords (UI/GL thread race)
- Fix error dismiss race with cancellable Job tracking

Lifecycle & resource management:
- Release GL resources via glSurfaceView.queueEvent (must run on GL thread)
- Pause GLSurfaceView when entering gallery preview mode
- Shut down captureExecutor in CameraManager.release() (thread leak)
- Use WeakReference for lifecycleOwnerRef to avoid Activity GC delay
- Fix thumbnail bitmap leak on coroutine cancellation (add to finally)
- Guarantee imageProxy.close() in finally block

Performance:
- Compute gradient mask at 1/4 resolution with bilinear upscale (~93%
  less per-pixel trig work, ~75% less mask memory)
- Precompute cos/sin on CPU, pass as uCosAngle/uSinAngle uniforms
  (eliminates per-fragment transcendental calls in GLSL)
- Unroll 9-tap Gaussian blur kernel (avoids integer-branched weight
  lookup that de-optimizes on mobile GPUs)
- Add 80ms debounce to preview recomputation during slider drags

Silent failure fixes:
- Check bitmap.compress() return value; report error on failure
- Log all loadBitmapFromUri null paths (stream, dimensions, decode)
- Surface preview computation errors and ActivityNotFoundException to user
- Return boolean from writeExifToUri, log at ERROR level
- Wrap gallery preview downscale in try-catch (OOM protection)

Config:
- Add ACCESS_MEDIA_LOCATION permission (GPS EXIF on Android 10+)
- Accept coarse-only location grant for geotags
- Remove dead adjustResize (no effect with edge-to-edge)
- Set windowBackground to black (eliminates white flash on cold start)
- Add values-night theme for dark mode
- Remove overly broad ProGuard keeps (CameraX/GMS ship consumer rules)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 13:44:12 +01:00
14 changed files with 394 additions and 159 deletions

CLAUDE.md (new file, 102 lines)

@@ -0,0 +1,102 @@
# Tilt-Shift Camera
Android camera app that applies a real-time tilt-shift (miniature/diorama) blur effect to the camera preview and to imported gallery images. Built with Kotlin, Jetpack Compose, CameraX, and OpenGL ES 2.0.
## What it does
- Live camera preview with GPU-accelerated tilt-shift blur via GLSL fragment shader
- Gallery import with CPU-based preview that updates in real time as you adjust parameters
- Supports both **linear** (band) and **radial** (elliptical) blur modes
- Gesture controls: drag to position, pinch to resize, two-finger rotate
- Slider panel for precise control of blur, falloff, size, angle, aspect ratio
- Multi-lens support on devices with multiple back cameras
- EXIF GPS tagging from device location
- Saves processed images to MediaStore (scoped storage)
## Architecture
```
no.naiv.tiltshift/
MainActivity.kt # Entry point, permissions, edge-to-edge setup
ui/
CameraScreen.kt # Main Compose UI, GL surface, controls
CameraViewModel.kt # State management, gallery preview loop, bitmap lifecycle
TiltShiftOverlay.kt # Gesture handling and visual guides
ZoomControl.kt # Zoom presets and indicator
LensSwitcher.kt # Multi-lens picker
theme/AppColors.kt # Color constants
camera/
CameraManager.kt # CameraX lifecycle, zoom, lens binding
ImageCaptureHandler.kt # Capture pipeline, CPU blur/mask, gallery processing
LensController.kt # Enumerates physical camera lenses
effect/
TiltShiftRenderer.kt # GLSurfaceView.Renderer for live camera preview
TiltShiftShader.kt # Compiles GLSL, sets uniforms (incl. precomputed trig)
BlurParameters.kt # Data class for all effect parameters
storage/
PhotoSaver.kt # MediaStore writes, EXIF metadata, IS_PENDING pattern
SaveResult.kt # Sealed class for save outcomes
util/
LocationProvider.kt # FusedLocationProvider flow (accepts coarse or fine)
OrientationDetector.kt # Device rotation for EXIF
HapticFeedback.kt # Null-safe vibration wrapper
```
### Rendering pipeline
- **Camera preview**: OpenGL ES 2.0 via `GLSurfaceView` + `TiltShiftRenderer`. Camera frames arrive as `GL_TEXTURE_EXTERNAL_OES` from a `SurfaceTexture`. The fragment shader (`tiltshift_fragment.glsl`) applies blur per-fragment using precomputed `uCosAngle`/`uSinAngle` uniforms and an unrolled 9-tap Gaussian kernel.
- **Gallery preview**: CPU-based. A 1024px-max downscaled source is kept in `galleryPreviewSource`. `CameraViewModel.startPreviewLoop()` uses `collectLatest` on blur params (with 80ms debounce) to reactively recompute the preview via `ImageCaptureHandler.applyTiltShiftPreview()`.
- **Final save**: Full-resolution CPU pipeline — stack blur at 1/4 scale, gradient mask at 1/4 scale with bilinear upscale, per-pixel compositing. Camera captures save both original + processed; gallery imports save only the processed version (original already on device).
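The mask geometry is the same in both pipelines: rotate the pixel's offset from the focus center by the (precomputed) angle, then measure distance to a band or an ellipse. A self-contained Kotlin sketch of that math (function names here are illustrative, not the actual APIs):

```kotlin
import kotlin.math.abs
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Distance from the focus band (LINEAR mode): rotate the offset, take |y'|.
// cos/sin are evaluated once per frame, never per pixel or per fragment.
fun linearDist(x: Float, y: Float, cx: Float, cy: Float, angle: Float): Float {
    val c = cos(angle)
    val s = sin(angle)
    val dx = x - cx
    val dy = y - cy
    return abs(-dx * s + dy * c)
}

// Elliptical distance from the focus center (RADIAL mode).
fun radialDist(x: Float, y: Float, cx: Float, cy: Float, angle: Float, aspect: Float): Float {
    val c = cos(angle)
    val s = sin(angle)
    val dx = x - cx
    val dy = y - cy
    val rx = dx * c - dy * s
    val ry = dx * s + dy * c
    val ax = rx / aspect // squash one axis to form an ellipse
    return sqrt(ax * ax + ry * ry)
}
```

With `angle = 0` the linear distance reduces to `|dy|`, and the radial distance (with `aspect = 1`) to plain Euclidean distance, which makes a quick sanity check.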
## Build & run
```bash
./gradlew assembleRelease # Build release APK
./gradlew compileDebugKotlin # Quick compile check
adb install -r app/build/outputs/apk/release/naiv-tilt-shift-release.apk
```
Signing config is loaded from `local.properties` (not committed).
## Key design decisions and patterns
### Bitmap lifecycle (important!)
Bitmaps emitted to `StateFlow`s are **never eagerly recycled** immediately after replacement. Compose may still be drawing the old bitmap in the current frame. Instead:
- A `pendingRecyclePreview` / `pendingRecycleThumbnail` field holds the bitmap from the *previous* update
- On the *next* update, the pending bitmap is recycled (Compose has had a full frame to finish)
- Final cleanup happens in `cancelGalleryPreview()` (which `join()`s the preview job first) and `onCleared()`
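The handoff above can be sketched with the bitmap type stubbed out, since the pattern itself is plain state management (`FakeBitmap` and `PreviewHolder` are illustrative names, not the real classes):

```kotlin
// Stub standing in for android.graphics.Bitmap so the pattern runs off-device.
class FakeBitmap {
    var recycled = false
    fun recycle() { recycled = true }
}

class PreviewHolder {
    var current: FakeBitmap? = null
        private set
    private var pendingRecycle: FakeBitmap? = null

    // On each update: recycle the bitmap pending from the update before last,
    // then park the bitmap being replaced. This gives Compose a full frame to
    // finish drawing the old bitmap before its native memory is freed.
    fun publish(next: FakeBitmap) {
        pendingRecycle?.recycle()
        pendingRecycle = current
        current = next
    }

    // Final cleanup (cancelGalleryPreview / onCleared in the real code).
    fun clear() {
        pendingRecycle?.recycle(); pendingRecycle = null
        current?.recycle(); current = null
    }
}
```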
### Thread safety
- `galleryPreviewSource` is `@Volatile` (accessed from Main thread, IO dispatcher, and cancel path)
- `TiltShiftRenderer.currentTexCoords` is `@Volatile` (written by UI thread, read by GL thread)
- `cancelGalleryPreview()` cancels + `join()`s the preview job before recycling the source bitmap, because `applyTiltShiftEffect` is a long CPU loop with no suspension points
- GL resources are released via `glSurfaceView.queueEvent {}` (must run on GL thread)
- `CameraManager.captureExecutor` is shut down in `release()` to prevent thread leaks
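The visibility guarantee that `@Volatile` buys can be demonstrated off-device with a plain JVM sketch (hypothetical names; only `kotlin.concurrent.thread` from the stdlib is used):

```kotlin
import kotlin.concurrent.thread

class SharedParams {
    @Volatile var texCoordsVersion: Int = 0
}

// Spins a reader thread that waits for the writer's update to become visible.
fun observeUpdate(): Int {
    val shared = SharedParams()
    var observed = -1
    val reader = thread {
        while (shared.texCoordsVersion == 0) { /* busy-wait until visible */ }
        observed = shared.texCoordsVersion
    }
    shared.texCoordsVersion = 42 // write from the "UI" thread
    reader.join()
    return observed
}
```

Without `@Volatile` the reader's spin loop may legally never observe the write; the annotation establishes a happens-before edge between the volatile write and each subsequent read.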
### Error handling
- `bitmap.compress()` return value is checked; failure reported to user
- `loadBitmapFromUri()` logs all null-return paths (stream open, dimensions, decode)
- Error/success dismiss indicators use cancellable `Job` tracking to prevent race conditions
- `writeExifToUri()` returns boolean and logs at ERROR level on failure
## Permissions
| Permission | Purpose | Notes |
|-----------|---------|-------|
| `CAMERA` | Camera preview and capture | Required |
| `ACCESS_FINE_LOCATION` | GPS EXIF tagging | Optional; coarse-only grant also works |
| `ACCESS_COARSE_LOCATION` | GPS EXIF tagging | Fallback if fine denied |
| `ACCESS_MEDIA_LOCATION` | Read GPS from gallery images | Required on Android 10+ |
| `VIBRATE` | Haptic feedback | Always granted |
## Known limitations / future work
- `minSdk = 35` (Android 15) — intentional for personal use. Lower to 26-29 if distributing.
- Accompanist Permissions (`0.36.0`) is deprecated; should migrate to first-party `activity-compose` API.
- No user-facing toggle to disable GPS tagging — location is embedded whenever permission is granted.
- Dependencies are pinned to late-2024 versions; periodic bumps recommended.
- Fragment shader uses `int` uniform branching in GLSL ES 1.00 — works but could be cleaner with ES 3.00.

proguard-rules.pro

@@ -1,10 +1,5 @@
 # Add project specific ProGuard rules here.
+# CameraX and GMS Location ship their own consumer ProGuard rules.
 
-# Keep CameraX classes
--keep class androidx.camera.** { *; }
-
-# Keep OpenGL shader-related code
+# Keep OpenGL shader-related code (accessed via reflection by GLSL pipeline)
 -keep class no.naiv.tiltshift.effect.** { *; }
-
-# Keep location provider
--keep class com.google.android.gms.location.** { *; }

AndroidManifest.xml

@@ -6,10 +6,13 @@
     <uses-feature android:name="android.hardware.camera" android:required="true" />
 
     <uses-permission android:name="android.permission.CAMERA" />
-    <!-- Location for EXIF GPS data -->
+    <!-- Location for EXIF GPS data (coarse is sufficient for geotags) -->
     <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
     <uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
+    <!-- Required to read GPS from gallery images on Android 10+ -->
+    <uses-permission android:name="android.permission.ACCESS_MEDIA_LOCATION" />
+
     <!-- Vibration for haptic feedback -->
     <uses-permission android:name="android.permission.VIBRATE" />
@@ -30,7 +33,7 @@
             android:exported="true"
             android:screenOrientation="fullSensor"
             android:configChanges="orientation|screenSize|screenLayout|keyboardHidden"
-            android:windowSoftInputMode="adjustResize"
+            android:windowSoftInputMode="adjustNothing"
             android:theme="@style/Theme.TiltShiftCamera">
             <intent-filter>
                 <action android:name="android.intent.action.MAIN" />

camera/CameraManager.kt

@@ -18,7 +18,9 @@ import androidx.lifecycle.LifecycleOwner
 import kotlinx.coroutines.flow.MutableStateFlow
 import kotlinx.coroutines.flow.StateFlow
 import kotlinx.coroutines.flow.asStateFlow
+import java.lang.ref.WeakReference
 import java.util.concurrent.Executor
+import java.util.concurrent.ExecutorService
 import java.util.concurrent.Executors
 
 /**
@@ -58,11 +60,12 @@ class CameraManager(private val context: Context) {
     val isFrontCamera: StateFlow<Boolean> = _isFrontCamera.asStateFlow()
 
     /** Background executor for image capture callbacks to avoid blocking the main thread. */
-    private val captureExecutor: Executor = Executors.newSingleThreadExecutor()
+    private val captureExecutor: ExecutorService = Executors.newSingleThreadExecutor()
 
     private var surfaceTextureProvider: (() -> SurfaceTexture?)? = null
     private var surfaceSize: Size = Size(1920, 1080)
-    private var lifecycleOwnerRef: LifecycleOwner? = null
+    /** Weak reference to avoid preventing Activity GC across config changes. */
+    private var lifecycleOwnerRef: WeakReference<LifecycleOwner>? = null
 
     /**
      * Starts the camera with the given lifecycle owner.
@@ -73,7 +76,7 @@ class CameraManager(private val context: Context) {
         surfaceTextureProvider: () -> SurfaceTexture?
     ) {
         this.surfaceTextureProvider = surfaceTextureProvider
-        this.lifecycleOwnerRef = lifecycleOwner
+        this.lifecycleOwnerRef = WeakReference(lifecycleOwner)
 
         val cameraProviderFuture = ProcessCameraProvider.getInstance(context)
         cameraProviderFuture.addListener({
@@ -89,7 +92,10 @@ class CameraManager(private val context: Context) {
     }
 
     private fun bindCameraUseCases(lifecycleOwner: LifecycleOwner) {
-        val provider = cameraProvider ?: return
+        val provider = cameraProvider ?: run {
+            Log.w(TAG, "bindCameraUseCases called before camera provider initialized")
+            return
+        }
 
         // Unbind all use cases before rebinding
         provider.unbindAll()
@@ -164,12 +170,24 @@ class CameraManager(private val context: Context) {
     }
 
     /**
-     * Sets the zoom ratio.
+     * Sets the zoom ratio. Updates UI state only after the camera confirms the change.
      */
     fun setZoom(ratio: Float) {
         val clamped = ratio.coerceIn(_minZoomRatio.value, _maxZoomRatio.value)
-        camera?.cameraControl?.setZoomRatio(clamped)
-        _zoomRatio.value = clamped
+        val future = camera?.cameraControl?.setZoomRatio(clamped)
+        if (future != null) {
+            future.addListener({
+                try {
+                    future.get()
+                    _zoomRatio.value = clamped
+                } catch (e: Exception) {
+                    Log.w(TAG, "Zoom operation failed", e)
+                }
+            }, ContextCompat.getMainExecutor(context))
+        } else {
+            // Optimistic update when camera not available (e.g. during init)
+            _zoomRatio.value = clamped
+        }
     }
 
     /**
@@ -185,7 +203,7 @@ class CameraManager(private val context: Context) {
     fun switchCamera() {
         _isFrontCamera.value = !_isFrontCamera.value
         _zoomRatio.value = 1.0f // Reset zoom when switching
-        lifecycleOwnerRef?.let { bindCameraUseCases(it) }
+        lifecycleOwnerRef?.get()?.let { bindCameraUseCases(it) }
     }
 
     /**
@@ -207,10 +225,11 @@ class CameraManager(private val context: Context) {
     }
 
     /**
-     * Releases camera resources.
+     * Releases camera resources and shuts down the background executor.
      */
     fun release() {
         cameraProvider?.unbindAll()
+        captureExecutor.shutdown()
         camera = null
         preview = null
         imageCapture = null

camera/ImageCaptureHandler.kt

@@ -36,6 +36,8 @@ class ImageCaptureHandler(
         private const val TAG = "ImageCaptureHandler"
 
         /** Maximum decoded image dimension to prevent OOM from huge gallery images. */
         private const val MAX_IMAGE_DIMENSION = 4096
+        /** Scale factor for downscaling blur and mask computations. */
+        private const val SCALE_FACTOR = 4
     }
@@ -51,7 +53,7 @@ class ImageCaptureHandler(
      * Captures a photo and applies the tilt-shift effect.
      *
      * Phase 1 (inside suspendCancellableCoroutine / camera callback):
-     *   decode → rotate → apply effect (synchronous CPU work only)
+     *   decode -> rotate -> apply effect (synchronous CPU work only)
      *
      * Phase 2 (after continuation resumes, back in coroutine context):
      *   save bitmap via PhotoSaver (suspend-safe)
@@ -75,7 +77,6 @@ class ImageCaptureHandler(
                 val imageRotation = imageProxy.imageInfo.rotationDegrees
                 currentBitmap = imageProxyToBitmap(imageProxy)
-                imageProxy.close()
 
                 if (currentBitmap == null) {
                     continuation.resume(
@@ -97,6 +98,8 @@ class ImageCaptureHandler(
                     continuation.resume(
                         CaptureOutcome.Failed(SaveResult.Error("Failed to process image. Please try again.", e))
                     )
+                } finally {
+                    imageProxy.close()
                 }
             }
@@ -114,8 +117,9 @@ class ImageCaptureHandler(
         return when (captureResult) {
             is CaptureOutcome.Failed -> captureResult.result
             is CaptureOutcome.Processed -> {
+                var thumbnail: Bitmap? = null
                 try {
-                    val thumbnail = createThumbnail(captureResult.processed)
+                    thumbnail = createThumbnail(captureResult.processed)
                     val result = photoSaver.saveBitmapPair(
                         original = captureResult.original,
                         processed = captureResult.processed,
@@ -123,12 +127,14 @@ class ImageCaptureHandler(
                         location = location
                     )
                     if (result is SaveResult.Success) {
-                        result.copy(thumbnail = thumbnail)
+                        val output = result.copy(thumbnail = thumbnail)
+                        thumbnail = null // prevent finally from recycling the returned thumbnail
+                        output
                     } else {
+                        thumbnail?.recycle()
                         result
                     }
                 } finally {
+                    thumbnail?.recycle()
                     captureResult.original.recycle()
                     captureResult.processed.recycle()
                 }
@@ -206,7 +212,7 @@ class ImageCaptureHandler(
     /**
      * Processes an existing image from the gallery through the tilt-shift pipeline.
-     * Loads the image, applies EXIF rotation, processes the effect, and saves both versions.
+     * Loads the image, applies EXIF rotation, processes the effect, and saves the result.
      */
     suspend fun processExistingImage(
         imageUri: Uri,
@@ -215,6 +221,7 @@ class ImageCaptureHandler(
     ): SaveResult = withContext(Dispatchers.IO) {
         var originalBitmap: Bitmap? = null
         var processedBitmap: Bitmap? = null
+        var thumbnail: Bitmap? = null
         try {
             originalBitmap = loadBitmapFromUri(imageUri)
                 ?: return@withContext SaveResult.Error("Failed to load image")
@@ -223,7 +230,7 @@ class ImageCaptureHandler(
             processedBitmap = applyTiltShiftEffect(originalBitmap, blurParams)
 
-            val thumbnail = createThumbnail(processedBitmap)
+            thumbnail = createThumbnail(processedBitmap)
 
             val result = photoSaver.saveBitmap(
                 bitmap = processedBitmap,
@@ -232,9 +239,10 @@ class ImageCaptureHandler(
             )
             if (result is SaveResult.Success) {
-                result.copy(thumbnail = thumbnail)
+                val output = result.copy(thumbnail = thumbnail)
+                thumbnail = null // prevent finally from recycling the returned thumbnail
+                output
             } else {
+                thumbnail?.recycle()
                 result
             }
         } catch (e: SecurityException) {
@@ -244,6 +252,7 @@ class ImageCaptureHandler(
             Log.e(TAG, "Gallery image processing failed", e)
             SaveResult.Error("Failed to process image. Please try again.", e)
         } finally {
+            thumbnail?.recycle()
             originalBitmap?.recycle()
             processedBitmap?.recycle()
         }
@@ -259,6 +268,14 @@ class ImageCaptureHandler(
         val options = BitmapFactory.Options().apply { inJustDecodeBounds = true }
         context.contentResolver.openInputStream(uri)?.use { stream ->
             BitmapFactory.decodeStream(stream, null, options)
-        }
+        } ?: run {
+            Log.e(TAG, "Could not open input stream for URI (dimensions pass): $uri")
+            return null
+        }
+
+        if (options.outWidth <= 0 || options.outHeight <= 0) {
+            Log.e(TAG, "Image has invalid dimensions: ${options.outWidth}x${options.outHeight}, mime: ${options.outMimeType}")
+            return null
+        }
 
         // Calculate sample size to stay within MAX_IMAGE_DIMENSION
@@ -271,9 +288,15 @@ class ImageCaptureHandler(
         // Second pass: decode with sample size
         val decodeOptions = BitmapFactory.Options().apply { inSampleSize = sampleSize }
-        context.contentResolver.openInputStream(uri)?.use { stream ->
+        val bitmap = context.contentResolver.openInputStream(uri)?.use { stream ->
             BitmapFactory.decodeStream(stream, null, decodeOptions)
         }
+        if (bitmap == null) {
+            Log.e(TAG, "BitmapFactory.decodeStream returned null for URI: $uri (mime: ${options.outMimeType})")
+        }
+        bitmap
     } catch (e: SecurityException) {
         Log.e(TAG, "Permission denied loading bitmap from URI", e)
         null
@@ -340,6 +363,9 @@ class ImageCaptureHandler(
      * Applies tilt-shift blur effect to a bitmap.
      * Supports both linear and radial modes.
      *
+     * The gradient mask is computed at 1/4 resolution (matching the blur downscale)
+     * and upscaled for compositing, reducing peak memory by ~93%.
+     *
      * All intermediate bitmaps are tracked and recycled in a finally block
      * so that an OOM or other exception does not leak native memory.
      */
@@ -351,14 +377,12 @@ class ImageCaptureHandler(
         var scaled: Bitmap? = null
         var blurred: Bitmap? = null
         var blurredFullSize: Bitmap? = null
-        var mask: Bitmap? = null
 
         try {
             result = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
 
-            val scaleFactor = 4
-            val blurredWidth = width / scaleFactor
-            val blurredHeight = height / scaleFactor
+            val blurredWidth = width / SCALE_FACTOR
+            val blurredHeight = height / SCALE_FACTOR
 
             scaled = Bitmap.createScaledBitmap(source, blurredWidth, blurredHeight, true)
@@ -370,24 +394,22 @@ class ImageCaptureHandler(
             blurred.recycle()
             blurred = null
 
-            mask = createGradientMask(width, height, params)
+            // Compute mask at reduced resolution and upscale to avoid full-res per-pixel trig
+            val maskPixels = createGradientMaskPixels(blurredWidth, blurredHeight, params)
+            val fullMaskPixels = upscaleMask(maskPixels, blurredWidth, blurredHeight, width, height)
 
             // Composite: blend original with blurred based on mask
             val pixels = IntArray(width * height)
             val blurredPixels = IntArray(width * height)
-            val maskPixels = IntArray(width * height)
 
             source.getPixels(pixels, 0, width, 0, 0, width, height)
             blurredFullSize.getPixels(blurredPixels, 0, width, 0, 0, width, height)
-            mask.getPixels(maskPixels, 0, width, 0, 0, width, height)
 
             blurredFullSize.recycle()
             blurredFullSize = null
-            mask.recycle()
-            mask = null
 
             for (i in pixels.indices) {
-                val maskAlpha = (maskPixels[i] and 0xFF) / 255f
+                val maskAlpha = (fullMaskPixels[i] and 0xFF) / 255f
                 val origR = (pixels[i] shr 16) and 0xFF
                 val origG = (pixels[i] shr 8) and 0xFF
                 val origB = pixels[i] and 0xFF
@@ -412,16 +434,14 @@ class ImageCaptureHandler(
             scaled?.recycle()
             blurred?.recycle()
             blurredFullSize?.recycle()
-            mask?.recycle()
         }
     }
 
     /**
-     * Creates a gradient mask for the tilt-shift effect.
-     * Supports both linear and radial modes.
+     * Creates a gradient mask as a pixel array at the given dimensions.
+     * Returns packed ARGB ints where the blue channel encodes blur amount.
      */
-    private fun createGradientMask(width: Int, height: Int, params: BlurParameters): Bitmap {
-        val mask = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
+    private fun createGradientMaskPixels(width: Int, height: Int, params: BlurParameters): IntArray {
         val pixels = IntArray(width * height)
 
         val centerX = width * params.positionX
@@ -437,32 +457,22 @@ class ImageCaptureHandler(
             for (x in 0 until width) {
                 val dist = when (params.mode) {
                     BlurMode.LINEAR -> {
-                        // Rotate point around focus center
                         val dx = x - centerX
                         val dy = y - centerY
                         val rotatedY = -dx * sinAngle + dy * cosAngle
                         kotlin.math.abs(rotatedY)
                     }
                     BlurMode.RADIAL -> {
-                        // Calculate elliptical distance from center
                         var dx = x - centerX
                         var dy = y - centerY
-                        // Adjust for screen aspect ratio
                         dx *= screenAspect
-                        // Rotate
                         val rotatedX = dx * cosAngle - dy * sinAngle
                         val rotatedY = dx * sinAngle + dy * cosAngle
-                        // Apply ellipse aspect ratio
                         val adjustedX = rotatedX / params.aspectRatio
                         sqrt(adjustedX * adjustedX + rotatedY * rotatedY)
                     }
                 }
 
-                // Calculate blur amount based on distance from focus region
                 val blurAmount = when {
                     dist < focusSize -> 0f
                     dist < focusSize + transitionSize -> {
@@ -476,8 +486,48 @@ class ImageCaptureHandler(
             }
         }
 
-        mask.setPixels(pixels, 0, width, 0, 0, width, height)
-        return mask
+        return pixels
+    }
+
+    /**
+     * Bilinear upscale of a mask pixel array from small dimensions to full dimensions.
+     */
+    private fun upscaleMask(
+        smallPixels: IntArray,
+        smallW: Int, smallH: Int,
+        fullW: Int, fullH: Int
+    ): IntArray {
+        val fullPixels = IntArray(fullW * fullH)
+        val xRatio = smallW.toFloat() / fullW
+        val yRatio = smallH.toFloat() / fullH
+
+        for (y in 0 until fullH) {
+            val srcY = y * yRatio
+            val y0 = srcY.toInt().coerceIn(0, smallH - 1)
+            val y1 = (y0 + 1).coerceIn(0, smallH - 1)
+            val yFrac = srcY - y0
+
+            for (x in 0 until fullW) {
+                val srcX = x * xRatio
+                val x0 = srcX.toInt().coerceIn(0, smallW - 1)
+                val x1 = (x0 + 1).coerceIn(0, smallW - 1)
+                val xFrac = srcX - x0
+
+                // Bilinear interpolation on the blue channel (all channels are equal)
+                val v00 = smallPixels[y0 * smallW + x0] and 0xFF
+                val v10 = smallPixels[y0 * smallW + x1] and 0xFF
+                val v01 = smallPixels[y1 * smallW + x0] and 0xFF
+                val v11 = smallPixels[y1 * smallW + x1] and 0xFF
+
+                val top = v00 + (v10 - v00) * xFrac
+                val bottom = v01 + (v11 - v01) * xFrac
+                val gray = (top + (bottom - top) * yFrac).toInt().coerceIn(0, 255)
+
+                fullPixels[y * fullW + x] = (0xFF shl 24) or (gray shl 16) or (gray shl 8) or gray
+            }
+        }
+        return fullPixels
    }
 
     /**

effect/TiltShiftRenderer.kt

@@ -65,6 +65,7 @@ class TiltShiftRenderer(
         1f, 0f  // Top right of screen
     )
 
+    @Volatile
     private var currentTexCoords = texCoordsBack
 
     override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {

effect/TiltShiftShader.kt

@@ -4,6 +4,8 @@ import android.content.Context
 import android.opengl.GLES11Ext
 import android.opengl.GLES20
 import no.naiv.tiltshift.R
+import kotlin.math.cos
+import kotlin.math.sin
 import java.io.BufferedReader
 import java.io.InputStreamReader
@@ -33,6 +35,8 @@ class TiltShiftShader(private val context: Context) {
     private var uFalloffLocation: Int = 0
     private var uAspectRatioLocation: Int = 0
     private var uResolutionLocation: Int = 0
+    private var uCosAngleLocation: Int = 0
+    private var uSinAngleLocation: Int = 0
 
     /**
      * Compiles and links the shader program.
@@ -75,6 +79,8 @@ class TiltShiftShader(private val context: Context) {
         uFalloffLocation = GLES20.glGetUniformLocation(programId, "uFalloff")
         uAspectRatioLocation = GLES20.glGetUniformLocation(programId, "uAspectRatio")
         uResolutionLocation = GLES20.glGetUniformLocation(programId, "uResolution")
+        uCosAngleLocation = GLES20.glGetUniformLocation(programId, "uCosAngle")
+        uSinAngleLocation = GLES20.glGetUniformLocation(programId, "uSinAngle")
 
         // Clean up shaders (they're linked into program now)
         GLES20.glDeleteShader(vertexShader)
@@ -103,6 +109,16 @@ class TiltShiftShader(private val context: Context) {
         GLES20.glUniform1f(uFalloffLocation, params.falloff)
         GLES20.glUniform1f(uAspectRatioLocation, params.aspectRatio)
         GLES20.glUniform2f(uResolutionLocation, width.toFloat(), height.toFloat())
+
+        // Precompute angle trig on CPU to avoid per-fragment transcendental calls.
+        // The adjusted angle accounts for the 90deg coordinate transform.
+        val adjustedAngle = if (isFrontCamera) {
+            -params.angle - (Math.PI / 2).toFloat()
+        } else {
+            params.angle + (Math.PI / 2).toFloat()
+        }
+        GLES20.glUniform1f(uCosAngleLocation, cos(adjustedAngle))
+        GLES20.glUniform1f(uSinAngleLocation, sin(adjustedAngle))
     }
 
     /**

storage/PhotoSaver.kt

@@ -102,7 +102,10 @@ class PhotoSaver(private val context: Context) {
         ) ?: return SaveResult.Error("Failed to create MediaStore entry")
         contentResolver.openOutputStream(uri)?.use { outputStream ->
-            bitmap.compress(Bitmap.CompressFormat.JPEG, 95, outputStream)
+            if (!bitmap.compress(Bitmap.CompressFormat.JPEG, 95, outputStream)) {
+                Log.e(TAG, "Bitmap compression returned false")
+                return SaveResult.Error("Failed to compress image")
+            }
         } ?: return SaveResult.Error("Failed to open output stream")
         writeExifToUri(uri, orientation, location)
@@ -121,8 +124,11 @@ class PhotoSaver(private val context: Context) {
         }
     }
-    private fun writeExifToUri(uri: Uri, orientation: Int, location: Location?) {
-        try {
+    /**
+     * Writes EXIF metadata to a saved image. Returns false if writing failed.
+     */
+    private fun writeExifToUri(uri: Uri, orientation: Int, location: Location?): Boolean {
+        return try {
         context.contentResolver.openFileDescriptor(uri, "rw")?.use { pfd ->
             val exif = ExifInterface(pfd.fileDescriptor)
@@ -142,8 +148,10 @@ class PhotoSaver(private val context: Context) {
             exif.saveAttributes()
         }
+        true
     } catch (e: Exception) {
-        Log.w(TAG, "Failed to write EXIF data", e)
+        Log.e(TAG, "Failed to write EXIF data", e)
+        false
     }
 }
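The `compress()` check above matters because `Bitmap.compress()` signals failure by returning `false` rather than throwing, so ignoring the result silently produces a truncated or empty file. A framework-free sketch of the pattern, with an injected compressor lambda standing in for the real `Bitmap`/`OutputStream` pair and a `SaveResult` that mirrors the app's sealed type (names here are illustrative):

```kotlin
// Illustrative stand-in for the app's save result type.
sealed class SaveResult {
    object Success : SaveResult()
    data class Error(val message: String) : SaveResult()
}

// The compressor lambda models bitmap.compress(...): it returns false on
// failure, so the return value must be checked and surfaced explicitly.
fun savePhoto(compress: () -> Boolean): SaveResult {
    if (!compress()) {
        return SaveResult.Error("Failed to compress image")
    }
    return SaveResult.Success
}
```

Injecting the compressor makes the failure branch trivially testable, which a direct `Bitmap` call is not.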


@@ -165,10 +165,19 @@ fun CameraScreen(
         }
     }
-    // Cleanup GL resources (ViewModel handles its own cleanup in onCleared)
+    // Pause/resume GLSurfaceView when entering/leaving gallery preview
+    LaunchedEffect(isGalleryPreview) {
+        if (isGalleryPreview) {
+            glSurfaceView?.onPause()
+        } else {
+            glSurfaceView?.onResume()
+        }
+    }
+    // Cleanup GL resources on GL thread (ViewModel handles its own cleanup in onCleared)
     DisposableEffect(Unit) {
         onDispose {
-            renderer?.release()
+            glSurfaceView?.queueEvent { renderer?.release() }
         }
     }
@@ -460,6 +469,7 @@ fun CameraScreen(
                 context.startActivity(intent)
             } catch (e: android.content.ActivityNotFoundException) {
                 Log.w("CameraScreen", "No activity found to view image", e)
+                viewModel.showCameraError("No app available to view photos")
             }
         }
     },
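The `queueEvent` change above follows from a GL rule: GL objects belong to the thread that owns the GL context, so `release()` must be posted to the render thread rather than called directly from the Compose dispose callback. A framework-free sketch of that pattern, with a single-thread executor standing in for `GLSurfaceView`'s render thread and a hypothetical `RendererStub` in place of the real renderer:

```kotlin
import java.util.concurrent.Executors

// Stand-in for GLSurfaceView's dedicated render thread.
val glThread = Executors.newSingleThreadExecutor { r -> Thread(r, "GLThread") }

class RendererStub {
    @Volatile var released = false
    @Volatile var releasedOnGlThread = false
    fun release() {
        released = true
        // Record which thread actually ran the release work.
        releasedOnGlThread = Thread.currentThread().name == "GLThread"
    }
}

// Equivalent of glSurfaceView?.queueEvent { renderer?.release() }:
// the caller only enqueues; the owning thread does the teardown.
fun releaseOnGlThread(renderer: RendererStub) {
    glThread.execute { renderer.release() }
}
```

Posting rather than calling also means dispose returns immediately; the render thread drains its queue before the surface is torn down.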


@@ -10,6 +10,7 @@ import androidx.lifecycle.AndroidViewModel
 import androidx.lifecycle.viewModelScope
 import kotlinx.coroutines.Dispatchers
 import kotlinx.coroutines.Job
+import kotlinx.coroutines.delay
 import kotlinx.coroutines.flow.MutableStateFlow
 import kotlinx.coroutines.flow.StateFlow
 import kotlinx.coroutines.flow.asStateFlow
@@ -28,6 +29,12 @@ import no.naiv.tiltshift.util.OrientationDetector
 /**
  * ViewModel for the camera screen.
  * Survives configuration changes (rotation) and process death (via SavedStateHandle for primitives).
+ *
+ * Bitmap lifecycle: bitmaps emitted to StateFlows are never eagerly recycled,
+ * because Compose may still be drawing them on the next frame. Instead, the
+ * previous bitmap is stored and recycled only when the *next* replacement arrives,
+ * giving Compose at least one full frame to finish. Final cleanup happens in
+ * cancelGalleryPreview() and onCleared().
  */
 class CameraViewModel(application: Application) : AndroidViewModel(application) {
@@ -35,6 +42,8 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
         private const val TAG = "CameraViewModel"
         /** Max dimension for the preview source bitmap to keep effect computation fast. */
         private const val PREVIEW_MAX_DIMENSION = 1024
+        /** Debounce delay before recomputing preview to reduce GC pressure during slider drags. */
+        private const val PREVIEW_DEBOUNCE_MS = 80L
     }
     val cameraManager = CameraManager(application)
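The debounce constant above pairs with a `delay` inside `collectLatest` later in this file: each new slider value cancels the in-flight computation, and only a value that survives the 80 ms window gets processed. The semantics can be modeled as a pure function over timestamped events, which makes the behavior easy to reason about without coroutines (the function and event shape here are illustrative, not app code):

```kotlin
// Model of collectLatest + delay(PREVIEW_DEBOUNCE_MS): an update is processed
// only if no newer value arrives within the debounce window; the final value
// always gets through. Events are (timestampMillis, value) in arrival order.
fun <T> debounced(events: List<Pair<Long, T>>, windowMs: Long): List<T> =
    events.filterIndexed { i, (t, _) ->
        // Keep an event if it is the last one, or if the next event
        // arrives after the debounce window has already elapsed.
        i == events.lastIndex || events[i + 1].first - t >= windowMs
    }.map { it.second }
```

During a fast drag most intermediate values are dropped, so only a handful of preview bitmaps are ever allocated per gesture.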
@@ -75,12 +84,16 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
     val galleryImageUri: StateFlow<Uri?> = _galleryImageUri.asStateFlow()
     /** Downscaled source for fast preview recomputation. */
+    @Volatile
     private var galleryPreviewSource: Bitmap? = null
     /** Processed preview bitmap shown in the UI. */
     private val _galleryPreviewBitmap = MutableStateFlow<Bitmap?>(null)
     val galleryPreviewBitmap: StateFlow<Bitmap?> = _galleryPreviewBitmap.asStateFlow()
+    /** Previous preview bitmap, kept alive one extra cycle so Compose can finish drawing it. */
+    private var pendingRecyclePreview: Bitmap? = null
     private var previewJob: Job? = null
     val isGalleryPreview: Boolean get() = _galleryBitmap.value != null
@@ -96,6 +109,10 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
     private val _isProcessing = MutableStateFlow(false)
     val isProcessing: StateFlow<Boolean> = _isProcessing.asStateFlow()
+    // Dismiss jobs for timed indicators
+    private var errorDismissJob: Job? = null
+    private var successDismissJob: Job? = null
     fun updateBlurParams(params: BlurParameters) {
         _blurParams.value = params
     }
@@ -127,7 +144,8 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
             _galleryImageUri.value = uri
             // Create downscaled source for fast preview recomputation
-            galleryPreviewSource = withContext(Dispatchers.IO) {
+            val previewSource = try {
+                withContext(Dispatchers.IO) {
                 val maxDim = maxOf(bitmap.width, bitmap.height)
                 if (maxDim > PREVIEW_MAX_DIMENSION) {
                     val scale = PREVIEW_MAX_DIMENSION.toFloat() / maxDim
@@ -141,8 +159,19 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
                     bitmap.copy(bitmap.config ?: Bitmap.Config.ARGB_8888, false)
                 }
             }
+            } catch (e: Exception) {
+                Log.e(TAG, "Failed to create preview source", e)
+                null
+            }
+            if (previewSource != null) {
+                galleryPreviewSource = previewSource
                 startPreviewLoop()
+            } else {
+                haptics.error()
+                showError("Failed to prepare image for preview")
+                cancelGalleryPreview()
+            }
         } else {
             haptics.error()
             showError("Failed to load image")
@@ -150,27 +179,41 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
         }
     }
-    /** Reactively recomputes the tilt-shift preview when blur params change. */
+    /**
+     * Reactively recomputes the tilt-shift preview when blur params change.
+     * Uses debounce to reduce allocations during rapid slider drags.
+     * Old preview bitmaps are recycled one cycle late to avoid racing with Compose draws.
+     */
     private fun startPreviewLoop() {
         previewJob?.cancel()
         previewJob = viewModelScope.launch {
             _blurParams.collectLatest { params ->
+                delay(PREVIEW_DEBOUNCE_MS)
                 val source = galleryPreviewSource ?: return@collectLatest
                 try {
                     val processed = captureHandler.applyTiltShiftPreview(source, params)
-                    val old = _galleryPreviewBitmap.value
+                    // Recycle the bitmap from two updates ago (Compose has had time to finish)
+                    pendingRecyclePreview?.recycle()
+                    // The current preview becomes pending; the new one becomes current
+                    pendingRecyclePreview = _galleryPreviewBitmap.value
                     _galleryPreviewBitmap.value = processed
-                    old?.recycle()
                 } catch (e: Exception) {
-                    Log.w(TAG, "Preview computation failed", e)
+                    Log.e(TAG, "Preview computation failed", e)
+                    showError("Preview update failed")
                 }
             }
         }
     }
     fun cancelGalleryPreview() {
-        previewJob?.cancel()
+        // Cancel the preview job and wait for its CPU work to finish
+        // so we don't recycle galleryPreviewSource while it's being read
+        val job = previewJob
         previewJob = null
+        job?.cancel()
+        viewModelScope.launch {
+            job?.join()
         val oldGallery = _galleryBitmap.value
         val oldPreview = _galleryPreviewBitmap.value
@@ -180,9 +223,12 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
         oldGallery?.recycle()
         oldPreview?.recycle()
+        pendingRecyclePreview?.recycle()
+        pendingRecyclePreview = null
         galleryPreviewSource?.recycle()
         galleryPreviewSource = null
     }
+    }
     fun applyGalleryEffect() {
         val uri = _galleryImageUri.value ?: return
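The cancel-then-join sequence in `cancelGalleryPreview()` exists to prevent a use-after-free: the worker must be fully stopped before the shared source bitmap is recycled, or the blur loop could read freed native memory. The same discipline can be sketched with plain threads, where a `Thread` plus an `AtomicBoolean` stand in for the coroutine `Job` and its cancellation (class and function names here are illustrative):

```kotlin
import java.util.concurrent.atomic.AtomicBoolean

// Stand-in for the preview source bitmap: reading after recycle is a fault.
class SharedSource {
    @Volatile var recycled = false
    fun read(): Int {
        check(!recycled) { "use after free" }
        return 42
    }
}

// Cancel, then wait, then free, strictly in that order.
fun cancelAndRecycle(worker: Thread, cancelled: AtomicBoolean, source: SharedSource) {
    cancelled.set(true)      // like job.cancel(): request cooperative stop
    worker.join()            // like job.join(): wait for in-flight work to finish
    source.recycled = true   // only now is it safe to free the shared resource
}
```

Cancelling without joining would leave a window where the worker's current iteration still reads the source after it was freed.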
@@ -226,17 +272,22 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
         }
     }
+    /** Previous thumbnail kept one cycle for Compose to finish drawing. */
+    private var pendingRecycleThumbnail: Bitmap? = null
     private fun handleSaveResult(result: SaveResult) {
         when (result) {
             is SaveResult.Success -> {
                 haptics.success()
-                val oldThumb = _lastThumbnailBitmap.value
+                // Recycle the thumbnail from two updates ago (safe from Compose)
+                pendingRecycleThumbnail?.recycle()
+                pendingRecycleThumbnail = _lastThumbnailBitmap.value
                 _lastThumbnailBitmap.value = result.thumbnail
                 _lastSavedUri.value = result.uri
-                oldThumb?.recycle()
-                viewModelScope.launch {
+                successDismissJob?.cancel()
+                successDismissJob = viewModelScope.launch {
                     _showSaveSuccess.value = true
-                    kotlinx.coroutines.delay(1500)
+                    delay(1500)
                     _showSaveSuccess.value = false
                 }
             }
@@ -248,9 +299,10 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
     }
     private fun showError(message: String) {
-        viewModelScope.launch {
+        errorDismissJob?.cancel()
         _showSaveError.value = message
-            kotlinx.coroutines.delay(2000)
+        errorDismissJob = viewModelScope.launch {
+            delay(2000)
             _showSaveError.value = null
         }
     }
@@ -263,8 +315,10 @@ class CameraViewModel(application: Application) : AndroidViewModel(application)
         super.onCleared()
         cameraManager.release()
         _lastThumbnailBitmap.value?.recycle()
+        pendingRecycleThumbnail?.recycle()
         _galleryBitmap.value?.recycle()
         _galleryPreviewBitmap.value?.recycle()
+        pendingRecyclePreview?.recycle()
         galleryPreviewSource?.recycle()
     }
 }
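The deferred-recycle pattern used for both previews and thumbnails in this file can be captured in a small generic holder: the replaced object is parked for one more update cycle before being freed, so a Compose frame that is still drawing it can finish. A minimal sketch, with a hypothetical `Releasable` standing in for `android.graphics.Bitmap`:

```kotlin
// Stand-in for Bitmap: release() models recycle().
class Releasable {
    var released = false
        private set
    fun release() { released = true }
}

class DeferredReleaser {
    private var current: Releasable? = null
    private var pending: Releasable? = null

    fun replace(next: Releasable) {
        pending?.release()    // two updates old: nothing can still be drawing it
        pending = current     // one update old: keep alive one more cycle
        current = next
    }

    fun clear() {             // final cleanup, as in onCleared()
        pending?.release()
        current?.release()
        pending = null
        current = null
    }
}
```

The invariant is that an object is only released once it is at least two replacements old, or during final teardown.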


@@ -79,6 +79,10 @@ class LocationProvider(private val context: Context) {
         return ContextCompat.checkSelfPermission(
             context,
             Manifest.permission.ACCESS_FINE_LOCATION
+        ) == PackageManager.PERMISSION_GRANTED ||
+            ContextCompat.checkSelfPermission(
+                context,
+                Manifest.permission.ACCESS_COARSE_LOCATION
         ) == PackageManager.PERMISSION_GRANTED
     }
 }


@@ -20,6 +20,10 @@ uniform float uFalloff; // Transition sharpness (0-1, higher = more grad
 uniform float uAspectRatio; // Ellipse aspect ratio for radial mode
 uniform vec2 uResolution;   // Texture resolution for proper sampling
+// Precomputed trig for the adjusted angle (avoids per-fragment cos/sin calls)
+uniform float uCosAngle;
+uniform float uSinAngle;
 varying vec2 vTexCoord;
 // Calculate signed distance from the focus region for LINEAR mode
@@ -37,25 +41,11 @@ float linearFocusDistance(vec2 uv) {
     vec2 offset = uv - center;
     // Correct for screen aspect ratio to make coordinate space square
-    // After transform: offset.x = screen Y direction, offset.y = screen X direction
-    // Scale offset.y to match the scale of offset.x (height units)
     float screenAspect = uResolution.x / uResolution.y;
     offset.y *= screenAspect;
-    // Adjust angle to compensate for the coordinate transformation
-    // Back camera: +90° for the 90° CW rotation
-    // Front camera: -90° (negated due to X flip mirror effect)
-    float adjustedAngle;
-    if (uIsFrontCamera == 1) {
-        adjustedAngle = -uAngle - 1.5707963;
-    } else {
-        adjustedAngle = uAngle + 1.5707963;
-    }
-    float cosA = cos(adjustedAngle);
-    float sinA = sin(adjustedAngle);
-    // After rotation, measure perpendicular distance from center line
-    float rotatedY = -offset.x * sinA + offset.y * cosA;
+    // Use precomputed cos/sin for the adjusted angle
+    float rotatedY = -offset.x * uSinAngle + offset.y * uCosAngle;
     return abs(rotatedY);
 }
@@ -63,7 +53,6 @@ float linearFocusDistance(vec2 uv) {
 // Calculate signed distance from the focus region for RADIAL mode
 float radialFocusDistance(vec2 uv) {
     // Center point of the focus region
-    // Transform from screen coordinates to texture coordinates
     vec2 center;
     if (uIsFrontCamera == 1) {
         center = vec2(1.0 - uPositionY, 1.0 - uPositionX);
@@ -72,24 +61,14 @@ float radialFocusDistance(vec2 uv) {
     }
     vec2 offset = uv - center;
-    // Correct for screen aspect ratio to make coordinate space square
-    // After transform: offset.x = screen Y direction, offset.y = screen X direction
-    // Scale offset.y to match the scale of offset.x (height units)
+    // Correct for screen aspect ratio
     float screenAspect = uResolution.x / uResolution.y;
     offset.y *= screenAspect;
-    // Apply rotation with angle adjustment for coordinate transformation
-    float adjustedAngle;
-    if (uIsFrontCamera == 1) {
-        adjustedAngle = -uAngle - 1.5707963;
-    } else {
-        adjustedAngle = uAngle + 1.5707963;
-    }
-    float cosA = cos(adjustedAngle);
-    float sinA = sin(adjustedAngle);
+    // Use precomputed cos/sin for rotation
     vec2 rotated = vec2(
-        offset.x * cosA - offset.y * sinA,
-        offset.x * sinA + offset.y * cosA
+        offset.x * uCosAngle - offset.y * uSinAngle,
+        offset.x * uSinAngle + offset.y * uCosAngle
     );
     // Apply ellipse aspect ratio
@@ -114,26 +93,12 @@ float blurFactor(float dist) {
     return smoothstep(0.0, 1.0, normalizedDist) * uBlurAmount;
 }
-// Get Gaussian weight for blur kernel (9-tap, sigma ~= 2.0)
-float getWeight(int i) {
-    if (i == 0) return 0.0162;
-    if (i == 1) return 0.0540;
-    if (i == 2) return 0.1216;
-    if (i == 3) return 0.1933;
-    if (i == 4) return 0.2258;
-    if (i == 5) return 0.1933;
-    if (i == 6) return 0.1216;
-    if (i == 7) return 0.0540;
-    return 0.0162; // i == 8
-}
-// Sample with Gaussian blur
+// Sample with Gaussian blur (9-tap, sigma ~= 2.0, unrolled for GLSL ES 1.00 compatibility)
 vec4 sampleBlurred(vec2 uv, float blur) {
     if (blur < 0.01) {
         return texture2D(uTexture, uv);
     }
-    vec4 color = vec4(0.0);
     vec2 texelSize = 1.0 / uResolution;
     // For radial mode, blur in radial direction from center
@@ -141,7 +106,6 @@ vec4 sampleBlurred(vec2 uv, float blur) {
     vec2 blurDir;
     if (uMode == 1) {
         // Radial: blur away from center
-        // Transform from screen coordinates to texture coordinates
         vec2 center;
         if (uIsFrontCamera == 1) {
             center = vec2(1.0 - uPositionY, 1.0 - uPositionX);
@@ -156,26 +120,25 @@ vec4 sampleBlurred(vec2 uv, float blur) {
             blurDir = vec2(1.0, 0.0);
         }
     } else {
-        // Linear: blur perpendicular to focus line
-        // Adjust angle for coordinate transformation
-        float blurAngle;
-        if (uIsFrontCamera == 1) {
-            blurAngle = -uAngle - 1.5707963;
-        } else {
-            blurAngle = uAngle + 1.5707963;
-        }
-        blurDir = vec2(cos(blurAngle), sin(blurAngle));
+        // Linear: blur perpendicular to focus line using precomputed trig
+        blurDir = vec2(uCosAngle, uSinAngle);
     }
     // Scale blur radius by blur amount
     float radius = blur * 20.0;
+    vec2 step = blurDir * texelSize * radius;
-    // 9-tap Gaussian blur
-    for (int i = 0; i < 9; i++) {
-        float offset = float(i) - 4.0;
-        vec2 samplePos = uv + blurDir * texelSize * offset * radius;
-        color += texture2D(uTexture, samplePos) * getWeight(i);
-    }
+    // Unrolled 9-tap Gaussian blur (avoids integer-branched weight lookup)
+    vec4 color = vec4(0.0);
+    color += texture2D(uTexture, uv + step * -4.0) * 0.0162;
+    color += texture2D(uTexture, uv + step * -3.0) * 0.0540;
+    color += texture2D(uTexture, uv + step * -2.0) * 0.1216;
+    color += texture2D(uTexture, uv + step * -1.0) * 0.1933;
+    color += texture2D(uTexture, uv) * 0.2258;
+    color += texture2D(uTexture, uv + step * 1.0) * 0.1933;
+    color += texture2D(uTexture, uv + step * 2.0) * 0.1216;
+    color += texture2D(uTexture, uv + step * 3.0) * 0.0540;
+    color += texture2D(uTexture, uv + step * 4.0) * 0.0162;
     return color;
 }
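The nine weights hard-coded into the unrolled kernel can be sanity-checked on the CPU: they should be symmetric about the center tap and sum to just under 1.0, since a finite 9-tap window truncates the Gaussian's tails (the small energy loss darkens blurred pixels by well under 1%). A quick check, copying the shader's constants:

```kotlin
// The shader's 9-tap kernel weights, center tap at index 4.
val kernel = floatArrayOf(
    0.0162f, 0.0540f, 0.1216f, 0.1933f, 0.2258f,
    0.1933f, 0.1216f, 0.0540f, 0.0162f
)

// Symmetry and near-unit sum are what make the blur direction-neutral
// and brightness-preserving (to within the truncation error).
val sum = kernel.sum()
val symmetric = (0..3).all { kernel[it] == kernel[8 - it] }
```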


@@ -0,0 +1,9 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+    <style name="Theme.TiltShiftCamera" parent="android:Theme.Material.NoActionBar">
+        <item name="android:statusBarColor">@android:color/transparent</item>
+        <item name="android:navigationBarColor">@android:color/transparent</item>
+        <item name="android:windowLayoutInDisplayCutoutMode">shortEdges</item>
+        <item name="android:windowBackground">@android:color/black</item>
+    </style>
+</resources>


@@ -1,8 +1,9 @@
 <?xml version="1.0" encoding="utf-8"?>
 <resources>
-    <style name="Theme.TiltShiftCamera" parent="android:Theme.Material.NoActionBar">
+    <style name="Theme.TiltShiftCamera" parent="android:Theme.Material.Light.NoActionBar">
         <item name="android:statusBarColor">@android:color/transparent</item>
         <item name="android:navigationBarColor">@android:color/transparent</item>
         <item name="android:windowLayoutInDisplayCutoutMode">shortEdges</item>
+        <item name="android:windowBackground">@android:color/black</item>
     </style>
 </resources>