WIP: integrate-old-refactors-of-github #1
46  .gitignore  vendored
@@ -39,3 +39,49 @@ management-dashboard-web-app/users.txt
result
result-*
.direnv/
# Python
__pycache__/
*.py[cod]
*.egg-info/
.venv/
.uv/
*.env
.env.*.local
.host-ip
.pytest_cache/
.mypy_cache/

# Node / Vite
node_modules/
dist/
.build/
.vite/
*.log

# Editor/OS
.DS_Store
Thumbs.db
.vscode/
.idea/

# API storage & logs
camera-management-api/storage/
camera-management-api/usda_vision_system.log

# Docker
*.pid

camera-management-api/camera_sdk/
camera-management-api/core
management-dashboard-web-app/users.txt

# Jupyter Notebooks
*.ipynb
supabase/.temp/cli-latest

# Archive env backups (may contain secrets)
archive/management-dashboard-web-app/env-backups/
# Nix
result
result-*
.direnv/
74  README.md
@@ -7,6 +7,13 @@ A unified monorepo combining the camera API service and the web dashboard for US
- `camera-management-api/` - Python API service for camera management (USDA-Vision-Cameras)
- `management-dashboard-web-app/` - React web dashboard for experiment management (pecan_experiments)
- `supabase/` - Database configuration, migrations, and seed data (shared infrastructure)
- `media-api/` - Python service for video/thumbnail serving (port 8090)
- `video-remote/` - Frontend for video browsing (port 3001)
- `vision-system-remote/` - Camera/vision UI (port 3002)
- `scheduling-remote/` - Scheduling/availability UI (port 3003)
- `scripts/` - Host IP, RTSP checks, env helpers (see [scripts/README.md](scripts/README.md))
- `docs/` - Setup, Supabase, RTSP, design docs
- `mediamtx.yml` - RTSP/WebRTC config for MediaMTX streaming

## Quick Start

@@ -28,15 +35,15 @@ The wrapper script automatically:

For more details, see [Docker Compose Environment Setup](docs/DOCKER_COMPOSE_ENV_SETUP.md).

- Web: <http://localhost:5173>
- Web: <http://localhost:8080>
- API: <http://localhost:8000>
- MQTT broker: localhost:1883
- MQTT is optional; configure in API config if used (see `.env.example`).

To stop: `docker compose down`

### Development Mode (Recommended for Development)

For development with live logging, debugging, and hot reloading:
For development, use the same Docker Compose stack as production. The web app runs with the Vite dev server on port 8080 (hot reload); the API runs on port 8000.

1) Copy env template and set values (for web/Supabase):

@@ -45,36 +52,25 @@ cp .env.example .env
# set VITE_SUPABASE_URL and VITE_SUPABASE_ANON_KEY in .env
```

2) Start the development environment:
2) Start the stack (with logs in the foreground, or add `-d` for detached):

```bash
./dev-start.sh
./docker-compose.sh up --build
```

This will:
- Start containers with debug logging enabled
- Enable hot reloading for both API and web services
- Show all logs in real-time
- Keep containers running for debugging

**Development URLs:**
- Web: <http://localhost:8080> (with hot reloading)
- API: <http://localhost:8000> (with debug logging)
- Web: <http://localhost:8080> (Vite dev server with hot reload)
- API: <http://localhost:8000>

**Development Commands:**
- `./dev-start.sh` - Start development environment
- `./dev-stop.sh` - Stop development environment
- `./dev-logs.sh` - View logs (use `-f` to follow, `-t N` for last N lines)
- `./dev-logs.sh -f api` - Follow API logs only
- `./dev-logs.sh -f web` - Follow web logs only
- `./dev-shell.sh` - Open shell in API container
- `./dev-shell.sh web` - Open shell in web container

**Debug Features:**
- API runs with `--debug --verbose` flags for maximum logging
- Web runs with Vite dev server for hot reloading
- All containers have `stdin_open: true` and `tty: true` for debugging
- Environment variables set for development mode
- `./docker-compose.sh up --build` - Start stack (omit `-d` to see logs)
- `./docker-compose.sh up --build -d` - Start stack in background
- `docker compose down` - Stop all services
- `docker compose logs -f` - Follow all logs
- `docker compose logs -f api` - Follow API logs only
- `docker compose logs -f web` - Follow web logs only
- `docker compose exec api sh` - Open shell in API container
- `docker compose exec web sh` - Open shell in web container

## Services

@@ -84,16 +80,34 @@ This will:
- Video recording controls
- File management

### Web Dashboard (Port 5173)
### Web Dashboard (Port 8080)

- User authentication via Supabase
- Experiment definition and management
- Camera control interface
- Video playback and analysis

### MQTT Broker (Port 1883)
### Media API (Port 8090)

- Local Mosquitto broker for development and integration testing
- Video listing, thumbnails, transcoding

### Video Remote (Port 3001)

- Video browser UI

### Vision System Remote (Port 3002)

- Camera/vision control UI

### Scheduling Remote (Port 3003)

- Scheduling/availability UI

### MediaMTX (Ports 8554, 8889, 8189)

- RTSP and WebRTC streaming (config: [mediamtx.yml](mediamtx.yml))

Supabase services are currently commented out in `docker-compose.yml` and can be run via Supabase CLI (e.g. from `management-dashboard-web-app`). See [docs](docs/) for setup.

## Git Subtree Workflow

@@ -138,7 +152,7 @@ Notes:
- Storage (recordings) is mapped to `camera-management-api/storage/` and ignored by git.

- Web
  - Code lives under `management-dashboard-web-app/` with a Vite dev server on port 5173.
  - Code lives under `management-dashboard-web-app/` with a Vite dev server on port 8080 when run via Docker.
  - Environment: set `VITE_SUPABASE_URL` and `VITE_SUPABASE_ANON_KEY` in `.env` (not committed).
  - Common scripts: `npm run dev`, `npm run build` (executed inside the container by compose).
7  archive/management-dashboard-web-app/README.md  Normal file
@@ -0,0 +1,7 @@
# Archive: management-dashboard-web-app legacy/backup files

Moved from `management-dashboard-web-app/` so the app directory only contains active code and config.

- **env-backups/** – Old `.env.backup` and timestamped backup (Supabase URL/key). Keep out of version control if they contain secrets.
- **experiment-data/** – CSV run sheets: `phase_2_JC_experimental_run_sheet.csv`, `post_workshop_meyer_experiments.csv`. Source/reference data for experiments.
- **test-api-fix.js** – One-off test script for camera config API; not part of the app build.
@@ -0,0 +1,61 @@
experiment_number,soaking_duration_hr,air_drying_duration_min,plate_contact_frequency_hz,throughput_rate_pecans_sec,crush_amount_in,entry_exit_height_diff_in
0,34,19,53,28,0.05,-0.09
1,24,27,34,29,0.03,0.01
12,28,59,37,23,0.06,-0.08
15,16,60,30,24,0.07,0.02
4,13,41,41,38,0.05,0.03
18,18,49,38,35,0.07,-0.08
11,24,59,42,25,0.07,-0.05
16,20,59,41,14,0.07,0.04
4,13,41,41,38,0.05,0.03
19,11,25,56,34,0.06,-0.09
15,16,60,30,24,0.07,0.02
16,20,59,41,14,0.07,0.04
10,26,60,44,12,0.08,-0.1
1,24,27,34,29,0.03,0.01
17,34,60,34,29,0.07,-0.09
5,30,33,30,36,0.05,-0.04
2,38,10,60,28,0.06,-0.1
2,38,10,60,28,0.06,-0.1
13,21,59,41,21,0.06,-0.09
1,24,27,34,29,0.03,0.01
14,22,59,45,17,0.07,-0.08
6,10,22,37,30,0.06,0.02
11,24,59,42,25,0.07,-0.05
19,11,25,56,34,0.06,-0.09
8,27,12,55,24,0.04,0.04
18,18,49,38,35,0.07,-0.08
5,30,33,30,36,0.05,-0.04
9,32,26,47,26,0.07,0.03
3,11,36,42,13,0.07,-0.07
10,26,60,44,12,0.08,-0.1
8,27,12,55,24,0.04,0.04
5,30,33,30,36,0.05,-0.04
8,27,12,55,24,0.04,0.04
18,18,49,38,35,0.07,-0.08
3,11,36,42,13,0.07,-0.07
10,26,60,44,12,0.08,-0.1
17,34,60,34,29,0.07,-0.09
13,21,59,41,21,0.06,-0.09
12,28,59,37,23,0.06,-0.08
9,32,26,47,26,0.07,0.03
14,22,59,45,17,0.07,-0.08
0,34,19,53,28,0.05,-0.09
7,15,30,35,32,0.05,-0.07
0,34,19,53,28,0.05,-0.09
15,16,60,30,24,0.07,0.02
13,21,59,41,21,0.06,-0.09
11,24,59,42,25,0.07,-0.05
7,15,30,35,32,0.05,-0.07
16,20,59,41,14,0.07,0.04
3,11,36,42,13,0.07,-0.07
7,15,30,35,32,0.05,-0.07
6,10,22,37,30,0.06,0.02
19,11,25,56,34,0.06,-0.09
6,10,22,37,30,0.06,0.02
2,38,10,60,28,0.06,-0.1
14,22,59,45,17,0.07,-0.08
4,13,41,41,38,0.05,0.03
9,32,26,47,26,0.07,0.03
17,34,60,34,29,0.07,-0.09
12,28,59,37,23,0.06,-0.08
@@ -0,0 +1,41 @@
phase_name,machine_type,Motor Speed (Hz),soaking_duration_hr,air_drying_duration_min,jig Displacement (in),Spring Stiffness (N/m),reps_required,rep
"Post Workshop Meyer Experiments","Meyer Cracker",33,27,28,-0.307,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",30,37,17,-0.311,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",47,36,50,-0.291,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",42,12,30,-0.314,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",53,34,19,-0.302,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",37,18,40,-0.301,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",40,14,59,-0.286,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",39,18,32,-0.309,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",49,11,31,-0.299,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",47,33,12,-0.295,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",52,23,36,-0.302,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",59,37,35,-0.299,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",41,15,15,-0.312,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",46,24,22,-0.303,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",50,36,15,-0.308,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",36,32,48,-0.306,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",33,28,38,-0.308,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",35,31,51,-0.311,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",55,20,57,-0.304,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",44,10,27,-0.313,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",37,16,43,-0.294,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",56,25,42,-0.31,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",30,13,21,-0.292,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",60,29,46,-0.294,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",41,21,54,-0.306,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",55,29,54,-0.296,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",39,30,48,-0.293,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",34,35,53,-0.285,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",57,32,39,-0.291,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",45,27,38,-0.296,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",52,17,25,-0.297,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",51,13,22,-0.288,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",36,19,11,-0.29,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",44,38,32,-0.315,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",58,26,18,-0.289,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",32,22,52,-0.288,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",43,12,56,-0.287,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",60,16,45,-0.298,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",54,22,25,-0.301,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",48,24,13,-0.305,2000,1,1
132  archive/management-dashboard-web-app/test-api-fix.js  Normal file
@@ -0,0 +1,132 @@
// Test script to verify the camera configuration API fix
// This simulates the VisionApiClient.getCameraConfig method

class TestVisionApiClient {
  constructor() {
    this.baseUrl = 'http://localhost:8000'
  }

  async request(endpoint) {
    const response = await fetch(`${this.baseUrl}${endpoint}`, {
      headers: {
        'Content-Type': 'application/json',
      }
    })

    if (!response.ok) {
      const errorText = await response.text()
      throw new Error(`HTTP ${response.status}: ${response.statusText}\n${errorText}`)
    }

    return response.json()
  }

  // This is our fixed getCameraConfig method
  async getCameraConfig(cameraName) {
    try {
      const config = await this.request(`/cameras/${cameraName}/config`)

      // Ensure auto-recording fields have default values if missing
      return {
        ...config,
        auto_start_recording_enabled: config.auto_start_recording_enabled ?? false,
        auto_recording_max_retries: config.auto_recording_max_retries ?? 3,
        auto_recording_retry_delay_seconds: config.auto_recording_retry_delay_seconds ?? 5
      }
    } catch (error) {
      // If the error is related to missing auto-recording fields, try to handle it gracefully
      if (error.message?.includes('auto_start_recording_enabled') ||
          error.message?.includes('auto_recording_max_retries') ||
          error.message?.includes('auto_recording_retry_delay_seconds')) {

        // Try to get the raw camera data and add default auto-recording fields
        try {
          const response = await fetch(`${this.baseUrl}/cameras/${cameraName}/config`, {
            headers: {
              'Content-Type': 'application/json',
            }
          })

          if (!response.ok) {
            throw new Error(`HTTP ${response.status}: ${response.statusText}`)
          }

          const rawConfig = await response.json()

          // Add missing auto-recording fields with defaults
          return {
            ...rawConfig,
            auto_start_recording_enabled: false,
            auto_recording_max_retries: 3,
            auto_recording_retry_delay_seconds: 5
          }
        } catch (fallbackError) {
          throw new Error(`Failed to load camera configuration: ${error.message}`)
        }
      }

      throw error
    }
  }

  async getCameras() {
    return this.request('/cameras')
  }
}

// Test function
async function testCameraConfigFix() {
  console.log('🧪 Testing Camera Configuration API Fix')
  console.log('='.repeat(50)) // note: '=' * 50 is NaN in JS; repeat() prints the divider

  const api = new TestVisionApiClient()

  try {
    // First get available cameras
    console.log('📋 Getting camera list...')
    const cameras = await api.getCameras()
    const cameraNames = Object.keys(cameras)

    if (cameraNames.length === 0) {
      console.log('❌ No cameras found')
      return
    }

    console.log(`✅ Found ${cameraNames.length} cameras: ${cameraNames.join(', ')}`)

    // Test configuration for each camera
    for (const cameraName of cameraNames) {
      console.log(`\n🎥 Testing configuration for ${cameraName}...`)

      try {
        const config = await api.getCameraConfig(cameraName)

        console.log(`✅ Configuration loaded successfully for ${cameraName}`)
        console.log(`   - auto_start_recording_enabled: ${config.auto_start_recording_enabled}`)
        console.log(`   - auto_recording_max_retries: ${config.auto_recording_max_retries}`)
        console.log(`   - auto_recording_retry_delay_seconds: ${config.auto_recording_retry_delay_seconds}`)
        console.log(`   - exposure_ms: ${config.exposure_ms}`)
        console.log(`   - gain: ${config.gain}`)

      } catch (error) {
        console.log(`❌ Configuration failed for ${cameraName}: ${error.message}`)
      }
    }

    console.log('\n🎉 Camera configuration API test completed!')

  } catch (error) {
    console.log(`❌ Test failed: ${error.message}`)
  }
}

// Export for use in browser console or Node.js
if (typeof module !== 'undefined' && module.exports) {
  module.exports = { TestVisionApiClient, testCameraConfigFix }
} else {
  // Browser environment
  window.TestVisionApiClient = TestVisionApiClient
  window.testCameraConfigFix = testCameraConfigFix
}

console.log('📝 Test script loaded. Run testCameraConfigFix() to test the fix.')
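The defaulting trick this script relies on is the nullish-coalescing operator: `??` falls back to the default only when the field is `null` or `undefined`, so explicit falsy values such as `0` or `false` coming from the API survive. A minimal standalone sketch (the `withDefaults` helper is hypothetical, not part of the script):

```javascript
// Hypothetical helper mirroring the defaulting logic in getCameraConfig above.
// `??` substitutes the default only for null/undefined, never for 0 or false.
const withDefaults = (config) => ({
  ...config,
  auto_start_recording_enabled: config.auto_start_recording_enabled ?? false,
  auto_recording_max_retries: config.auto_recording_max_retries ?? 3,
  auto_recording_retry_delay_seconds: config.auto_recording_retry_delay_seconds ?? 5
})

console.log(withDefaults({ exposure_ms: 10 }).auto_recording_max_retries) // 3
console.log(withDefaults({ auto_recording_max_retries: 0 }).auto_recording_max_retries) // 0
```

Using `||` instead of `??` here would silently overwrite a deliberate `auto_recording_max_retries: 0`.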
@@ -2,10 +2,10 @@ networks:
  usda-vision-network:
    driver: bridge

volumes:
  supabase-db:
    driver: local
  supabase-storage:
# volumes:
#   supabase-db:
#     driver: local
#   supabase-storage:

services:
  # ============================================================================
@@ -17,7 +17,7 @@ services:
  # - Filter by label: docker compose ps --filter "label=com.usda-vision.service=supabase"
  # - Or use service names: docker compose ps supabase-*
  #
  # NOTE: Currently commented out to test Supabase CLI setup from management-dashboard-web-app
  # NOTE: Supabase CLI and docker-compose use root supabase/

  # # # Supabase Database
  # # supabase-db:
@@ -166,7 +166,7 @@ services:
  # supabase-rest:
  #   condition: service_started
  # environment:
  #   ANON_KEY: ${ANON_KEY:-[REDACTED]}
  #   ANON_KEY: ${ANON_KEY:-eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0}
  #   SERVICE_KEY: ${SERVICE_KEY:-eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImV4cCI6MTk4MzgxMjk5Nn0.EGIM96RAZx35lJzdJsyH-qQwv8Hdp7fsn3W0YpN81IU}
  #   POSTGREST_URL: http://supabase-rest:3000
  #   PGRST_JWT_SECRET: ${JWT_SECRET:-super-secret-jwt-token-with-at-least-32-characters-long}
@@ -205,7 +205,7 @@ services:
  #   DEFAULT_PROJECT_NAME: Default Project
  #   SUPABASE_URL: http://supabase-rest:3000
  #   SUPABASE_PUBLIC_URL: http://localhost:54321
  #   SUPABASE_ANON_KEY: ${ANON_KEY:-[REDACTED]}
  #   SUPABASE_ANON_KEY: ${ANON_KEY:-eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0}
  #   SUPABASE_SERVICE_KEY: ${SERVICE_KEY:-eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImV4cCI6MTk4MzgxMjk5Nn0.EGIM96RAZx35lJzdJsyH-qQwv8Hdp7fsn3W0YpN81IU}
  # ports:
  #   - "54323:3000"
@@ -400,6 +400,8 @@ services:
  video-remote:
    container_name: usda-vision-video-remote
    image: node:20-alpine
    tty: true
    stdin_open: true
    working_dir: /app
    environment:
      - CHOKIDAR_USEPOLLING=true
@@ -424,6 +426,8 @@ services:
  vision-system-remote:
    container_name: usda-vision-vision-system-remote
    image: node:20-alpine
    tty: true
    stdin_open: true
    working_dir: /app
    environment:
      - CHOKIDAR_USEPOLLING=true
@@ -447,6 +451,8 @@ services:
  scheduling-remote:
    container_name: usda-vision-scheduling-remote
    image: node:20-alpine
    tty: true
    stdin_open: true
    working_dir: /app
    env_file:
      - ./management-dashboard-web-app/.env
@@ -466,6 +472,14 @@ services:
      - "3003:3003"
    networks:
      - usda-vision-network
    develop:
      watch:
        - path: ./scheduling-remote
          action: restart
          ignore:
            - node_modules/
            - dist/
            - .git/

  media-api:
    container_name: usda-vision-media-api
@@ -7,6 +7,7 @@ Your current design has a **fundamental flaw** that prevents it from working cor
### The Problem

The phase tables (`soaking`, `airdrying`, `cracking`, `shelling`) have this constraint:

```sql
CONSTRAINT unique_soaking_per_experiment UNIQUE (experiment_id)
```

@@ -16,7 +17,8 @@ This means you can **only have ONE soaking record per experiment**, even if you

### Why This Happens

When you create an experiment with 3 repetitions:
1. ✅ 3 rows are created in `experiment_repetitions`

1. ✅ 3 rows are created in `experiment_repetitions`
2. ❌ But you can only create 1 row in `soaking` (due to UNIQUE constraint)
3. ❌ The other 2 repetitions cannot have soaking data!

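The failure mode above can be sketched in memory. This is a hypothetical simulation, not the real schema: a `Map` keyed by `experiment_id` behaves like the `UNIQUE (experiment_id)` index, so the second repetition's insert is rejected:

```javascript
// Hypothetical in-memory stand-in for the soaking table with UNIQUE (experiment_id).
const soaking = new Map() // key: experiment_id -> at most one row

function insertSoaking(experimentId, repetitionId) {
  if (soaking.has(experimentId)) {
    // Mirrors the unique-constraint violation the database would raise
    throw new Error(`duplicate key: experiment ${experimentId} already has a soaking row`)
  }
  soaking.set(experimentId, { experimentId, repetitionId })
}

insertSoaking('exp-1', 'rep-1') // repetition 1: ok
try {
  insertSoaking('exp-1', 'rep-2') // repetition 2: rejected
} catch (e) {
  console.log(e.message)
}
```

Keying the map by `repetition_id` instead (i.e., `UNIQUE (repetition_id)`) is exactly what makes one-phase-row-per-repetition possible.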
@@ -25,6 +27,7 @@ When you create an experiment with 3 repetitions:
### Current Approach: Separate Tables (❌ Not Recommended)

**Problems:**

- ❌ UNIQUE constraint breaks repetitions
- ❌ Schema duplication (same structure 4 times)
- ❌ Hard to query "all phases for a repetition"
@@ -34,6 +37,7 @@ When you create an experiment with 3 repetitions:

### Recommended Approach: Unified Table (✅ Best Practice)

**Benefits:**

- ✅ Properly supports repetitions (one phase per repetition)
- ✅ Automatic phase creation via database trigger
- ✅ Simple sequential time calculations
@@ -45,7 +49,7 @@ When you create an experiment with 3 repetitions:

I've created a migration file that implements a **unified `experiment_phase_executions` table**:

### Key Features:
### Key Features

1. **Single Table for All Phases**
   - Uses `phase_type` enum to distinguish phases
@@ -68,7 +72,8 @@ I've created a migration file that implements a **unified `experiment_phase_exec
## Files Created

1. **`docs/database_design_analysis.md`** - Detailed analysis with comparison matrix
2. **`management-dashboard-web-app/supabase/migrations/00012_unified_phase_executions.sql`** - Complete migration implementation
3. **`supabase/migrations/00012_unified_phase_executions.sql`** - Complete migration implementation

## Migration Path

@@ -81,6 +86,7 @@ I've created a migration file that implements a **unified `experiment_phase_exec
## Alternative: Fix Current Design

If you prefer to keep separate tables, you MUST:

1. Remove `UNIQUE (experiment_id)` constraints from all phase tables
2. Keep only `UNIQUE (repetition_id)` constraints
3. Add trigger to auto-create phase entries when repetitions are created
@@ -91,4 +97,3 @@ However, this still has the drawbacks of schema duplication and complexity.
## Recommendation

**Use the unified table approach** - it's cleaner, more maintainable, and properly supports your repetition model.
@@ -5,6 +5,7 @@ The Supabase containers are now integrated into the main `docker-compose.yml` fi
## What Changed

All Supabase services are now defined in the root `docker-compose.yml`:

- **supabase-db**: PostgreSQL database (port 54322)
- **supabase-rest**: PostgREST API (port 54321)
- **supabase-auth**: GoTrue authentication service (port 9999)
@@ -48,6 +49,7 @@ VITE_SUPABASE_ANON_KEY=<your-anon-key>
```

The default anon key for local development is:

```
[REDACTED]
```
@@ -55,20 +57,22 @@ The default anon key for local development is:

### Migrations

Migrations are automatically run on first startup via the `supabase-migrate` service. The service:

1. Waits for the database to be ready
2. Runs all migrations from `supabase/migrations/` in alphabetical order
3. Runs seed files (`seed_01_users.sql` and `seed_02_phase2_experiments.sql`)

If you need to re-run migrations, you can:

1. Stop the containers: `docker compose down`
2. Remove the database volume: `docker volume rm usda-vision_supabase-db`
3. Start again: `docker compose up -d`

### Accessing Services

- **Supabase API**: http://localhost:54321
- **Supabase Studio**: http://localhost:54323
- **Email Testing (Inbucket)**: http://localhost:54324
- **Supabase API**: <http://localhost:54321>
- **Supabase Studio**: <http://localhost:54323>
- **Email Testing (Inbucket)**: <http://localhost:54324>
- **Database (direct)**: localhost:54322

### Network

@@ -88,12 +92,14 @@ If you were previously using `supabase start` from the `management-dashboard-web

### Port Conflicts

If you get port conflicts, make sure:

- No other Supabase instances are running
- The Supabase CLI isn't running containers (`supabase stop` if needed)

### Migration Issues

If migrations fail:

1. Check the logs: `docker compose logs supabase-migrate`
2. Ensure migration files are valid SQL
3. You may need to manually connect to the database and fix issues
@@ -101,7 +107,7 @@ If migrations fail:

### Database Connection Issues

If services can't connect to the database:

1. Check database is healthy: `docker compose ps supabase-db`
2. Check logs: `docker compose logs supabase-db`
3. Ensure the database password matches across all services
@@ -25,6 +25,7 @@ docker compose up -d
### For Supabase CLI Users

**Before** (old way):

```bash
cd management-dashboard-web-app
supabase start
@@ -32,6 +33,7 @@ supabase db reset
```

**After** (new way):

```bash
# From project root - no need to cd!
supabase start
@@ -50,23 +52,22 @@ If you have scripts or documentation that reference the old path, update them:
- ❌ `management-dashboard-web-app/supabase/config.toml`
- ✅ `supabase/config.toml`

## Backward Compatibility

The old directory (`management-dashboard-web-app/supabase/`) can be kept for reference, but it's no longer used by docker-compose or the Supabase CLI. You can safely remove it after verifying everything works:
## Current State

```bash
# After verifying everything works with the new location
rm -rf management-dashboard-web-app/supabase
```
The old directory (`management-dashboard-web-app/supabase/`) has been removed. All Supabase and DB configuration, migrations, and seeds now live only under the project root `supabase/` directory. Docker Compose and the Supabase CLI use root `supabase/` exclusively.

## Verification

To verify the migration worked:
To verify the migration:

1. **Check docker-compose paths** (only root supabase should be referenced):

1. **Check docker-compose paths**:
```bash
grep -r "supabase" docker-compose.yml
# Should show: ./supabase/ (not ./management-dashboard-web-app/supabase/)
grep "supabase" docker-compose.yml
# Should show only ./supabase/ (no management-dashboard-web-app/supabase/)
```

2. **Test Supabase CLI**:
@@ -76,7 +77,8 @@ To verify the migration worked:
# Should work without needing to cd into management-dashboard-web-app
```

3. **Test migrations**:
1. **Test migrations**:

```bash
docker compose up -d
docker compose logs supabase-migrate
@@ -90,4 +92,3 @@ To verify the migration worked:
✅ Easier to share database across services
✅ Better alignment with monorepo best practices
✅ Infrastructure separated from application code
303  docs/database_entities.md  Normal file
@@ -0,0 +1,303 @@
# Database Entities Documentation

This document describes the core entities in the USDA Vision database schema, focusing on entity-specific attributes (excluding generic fields like `id`, `created_at`, `updated_at`, `created_by`).

## Entity Relationships Overview

```
Experiment Phase (Template)
        ↓
Experiment
        ↓
Experiment Repetition
        ↓
Experiment Phase Execution (Soaking, Airdrying, Cracking, Shelling)
```

---

## 1. Experiment Phase

**Table:** `experiment_phases`

**Purpose:** Defines a template/blueprint for experiments that specifies which processing phases are included and their configuration.

### Attributes

- **name** (TEXT, UNIQUE, NOT NULL)
  - Unique name identifying the experiment phase template
  - Example: "Phase 2 - Standard Processing"

- **description** (TEXT, nullable)
  - Optional description providing details about the experiment phase

- **has_soaking** (BOOLEAN, NOT NULL, DEFAULT false)
  - Indicates whether soaking phase is included in experiments using this template

- **has_airdrying** (BOOLEAN, NOT NULL, DEFAULT false)
  - Indicates whether airdrying phase is included in experiments using this template

- **has_cracking** (BOOLEAN, NOT NULL, DEFAULT false)
  - Indicates whether cracking phase is included in experiments using this template

- **has_shelling** (BOOLEAN, NOT NULL, DEFAULT false)
  - Indicates whether shelling phase is included in experiments using this template

- **cracking_machine_type_id** (UUID, nullable)
  - References the machine type to be used for cracking (required if `has_cracking` is true)
  - Links to `machine_types` table

### Constraints

- At least one phase (soaking, airdrying, cracking, or shelling) must be enabled
- If `has_cracking` is true, `cracking_machine_type_id` must be provided
---
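The two constraints above can also be mirrored client-side for early validation before an insert. A hedged sketch — the real enforcement lives in the database, and `validateExperimentPhase` is a hypothetical helper, not part of the schema:

```javascript
// Hypothetical pre-insert validator for experiment_phases rows.
function validateExperimentPhase(phase) {
  const errors = []
  const anyPhase = phase.has_soaking || phase.has_airdrying ||
                   phase.has_cracking || phase.has_shelling
  if (!anyPhase) {
    errors.push('at least one phase must be enabled') // mirrors the CHECK constraint
  }
  if (phase.has_cracking && !phase.cracking_machine_type_id) {
    errors.push('cracking_machine_type_id is required when has_cracking is true')
  }
  return errors
}

console.log(validateExperimentPhase({ has_cracking: true })) // logs the missing machine-type error
```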
|
||||
|
||||
## 2. Experiment
|
||||
|
||||
**Table:** `experiments`
|
||||
|
||||
**Purpose:** Defines an experiment blueprint that specifies the parameters and requirements for conducting pecan processing experiments.
|
||||
|
||||
### Attributes
|
||||
|
||||
- **experiment_number** (INTEGER, NOT NULL)
|
||||
- Unique number identifying the experiment
|
||||
- Combined with `phase_id` must be unique
|
||||
|
||||
- **reps_required** (INTEGER, NOT NULL, CHECK > 0)
|
||||
- Number of repetitions required for this experiment
|
||||
- Must be greater than zero
|
||||
|
||||
- **weight_per_repetition_lbs** (DOUBLE PRECISION, NOT NULL, DEFAULT 5.0, CHECK > 0)
|
||||
- Weight in pounds required for each repetition of the experiment
|
||||
|
||||
- **results_status** (TEXT, NOT NULL, DEFAULT 'valid', CHECK IN ('valid', 'invalid'))
|
||||
- Status indicating whether the experiment results are considered valid or invalid
|
||||
|
||||
- **completion_status** (BOOLEAN, NOT NULL, DEFAULT false)
|
||||
- Indicates whether the experiment has been completed
|
||||
|
||||
- **phase_id** (UUID, NOT NULL)
|
||||
- References the experiment phase template this experiment belongs to
|
||||
- Links to `experiment_phases` table
|
||||
|
||||
### Constraints
|
||||
|
||||
- Combination of `experiment_number` and `phase_id` must be unique
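The composite uniqueness rule means the same experiment number may be reused across phases but not within one. A minimal in-memory sketch of that check (the database UNIQUE constraint is the real enforcement; `isExperimentKeyAvailable` is an illustrative name, not app code):

```typescript
// Hypothetical composite key mirroring UNIQUE (experiment_number, phase_id).
interface ExperimentKey {
  experiment_number: number
  phase_id: string
}

// True when no existing row already uses the candidate's (number, phase) pair.
function isExperimentKeyAvailable(existing: ExperimentKey[], candidate: ExperimentKey): boolean {
  return !existing.some(
    (e) => e.experiment_number === candidate.experiment_number && e.phase_id === candidate.phase_id
  )
}
```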

---

## 3. Experiment Repetition

**Table:** `experiment_repetitions`

**Purpose:** Represents a single execution instance of an experiment that can be scheduled and tracked.

### Attributes

- **experiment_id** (UUID, NOT NULL)
  - References the parent experiment blueprint
  - Links to the `experiments` table

- **repetition_number** (INTEGER, NOT NULL, CHECK > 0)
  - Sequential number identifying this repetition within the experiment
  - Must be unique per experiment

- **scheduled_date** (TIMESTAMP WITH TIME ZONE, nullable)
  - Date and time when the repetition is scheduled to be executed

- **status** (TEXT, NOT NULL, DEFAULT 'pending', CHECK IN ('pending', 'in_progress', 'completed', 'cancelled'))
  - Current status of the repetition execution
  - Values: `pending`, `in_progress`, `completed`, `cancelled`

### Constraints

- Combination of `experiment_id` and `repetition_number` must be unique
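The CHECK constraint only restricts the set of status values; it does not encode an ordering between them. The sketch below assumes a plausible lifecycle (pending → in_progress → completed, with cancellation possible until completion). This lifecycle is an assumption for illustration, not something the schema states.

```typescript
type RepetitionStatus = 'pending' | 'in_progress' | 'completed' | 'cancelled'

// Assumed lifecycle; the schema's CHECK constraint only limits the value set.
const allowedTransitions: Record<RepetitionStatus, RepetitionStatus[]> = {
  pending: ['in_progress', 'cancelled'],
  in_progress: ['completed', 'cancelled'],
  completed: [],
  cancelled: []
}

function canTransition(from: RepetitionStatus, to: RepetitionStatus): boolean {
  return allowedTransitions[from].includes(to)
}
```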

---

## 4. Experiment Phase Executions

**Table:** `experiment_phase_executions`

**Purpose:** Unified table storing execution data for all phase types (soaking, airdrying, cracking, shelling) associated with experiment repetitions.

### Common Attributes (All Phase Types)

- **repetition_id** (UUID, NOT NULL)
  - References the experiment repetition this phase execution belongs to
  - Links to the `experiment_repetitions` table

- **phase_type** (TEXT, NOT NULL, CHECK IN ('soaking', 'airdrying', 'cracking', 'shelling'))
  - Type of phase being executed
  - Must be one of: `soaking`, `airdrying`, `cracking`, `shelling`

- **scheduled_start_time** (TIMESTAMP WITH TIME ZONE, NOT NULL)
  - Planned start time for the phase execution

- **scheduled_end_time** (TIMESTAMP WITH TIME ZONE, nullable)
  - Planned end time for the phase execution
  - Automatically calculated for soaking and airdrying based on duration

- **actual_start_time** (TIMESTAMP WITH TIME ZONE, nullable)
  - Actual time when the phase execution started

- **actual_end_time** (TIMESTAMP WITH TIME ZONE, nullable)
  - Actual time when the phase execution ended

- **status** (TEXT, NOT NULL, DEFAULT 'pending', CHECK IN ('pending', 'scheduled', 'in_progress', 'completed', 'cancelled'))
  - Current status of the phase execution
### Phase-Specific Concepts: Independent & Dependent Variables

> **Note:** This section describes the conceptual variables for each phase (what we measure or control), not necessarily the current physical columns in the database. Some of these variables will be added to the schema in future migrations.

#### Soaking Phase

- **Independent variables (IV)**
  - **Pre-soaking shell moisture percentage**
    - Moisture percentage of the shell **before soaking**.
  - **Pre-soaking kernel moisture percentage**
    - Moisture percentage of the kernel **before soaking**.
  - **Average pecan diameter (inches)**
    - Average diameter of pecans in the batch, measured in inches.
  - **Batch weight**
    - Total weight of the batch being soaked.
  - **Soaking duration (minutes)**
    - Duration of the soaking process in minutes (currently represented as `soaking_duration_minutes`).

- **Dependent variables (DV)**
  - **Post-soaking shell moisture percentage**
    - Moisture percentage of the shell **after soaking**.
  - **Post-soaking kernel moisture percentage**
    - Moisture percentage of the kernel **after soaking**.

#### Airdrying Phase

- **Independent variables (IV)**
  - **Airdrying duration (minutes)**
    - Duration of the airdrying process in minutes (currently represented as `duration_minutes`).

- **Dependent variables (DV)**
  - *(TBD: moisture and other post-airdrying measurements can be added here when finalized.)*

#### Cracking Phase

- **Independent variables (IV)**
  - **Cracking machine type**
    - The type of cracking machine used (linked via `machine_type_id` and `experiment_phases.cracking_machine_type_id`).

- **Dependent variables (DV)**
  - *None defined yet for cracking.* Business/analysis metrics for cracking (e.g., crack quality, breakage rates) can be added later.

#### Shelling Phase

- **Independent variables (IV)**
  - **Shelling machine configuration parameters** (not yet present in the DB schema)
    - **Ring gap (inches)**
      - Radial gap setting of the shelling ring (e.g., `0.34` inches).
    - **Paddle RPM**
      - Rotational speed of the paddles (integer RPM value).
    - **Third machine parameter (TBD)**
      - The shelling machine expects a third configuration parameter that will be defined and added to the schema later.

- **Dependent variables (DV)**
  - **Half-yield ratio (percentage)**
    - Percentage of the shelled product that is composed of half kernels, relative to total yield.
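The half-yield ratio is a simple proportion. A sketch of the computation (the function name and the use of weights in pounds are illustrative assumptions; the metric could equally be computed from counts):

```typescript
// Hypothetical helper: percentage of total shelled yield made up of half kernels.
function halfYieldRatioPercent(halfKernelWeightLbs: number, totalYieldWeightLbs: number): number {
  if (totalYieldWeightLbs <= 0) {
    throw new Error('totalYieldWeightLbs must be positive')
  }
  return (halfKernelWeightLbs / totalYieldWeightLbs) * 100
}
```

For example, 2.5 lbs of half kernels out of 5 lbs of total yield gives a half-yield ratio of 50%.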

### Constraints

- Combination of `repetition_id` and `phase_type` must be unique (one execution per phase type per repetition)

### Notes

- Phase executions are automatically created when an experiment repetition is created, based on the experiment phase template configuration
- Sequential phases (soaking → airdrying → cracking → shelling) automatically calculate their `scheduled_start_time` based on the previous phase's `scheduled_end_time`
- For soaking and airdrying phases, `scheduled_end_time` is automatically calculated as `scheduled_start_time` + duration
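The scheduling rules in the notes above can be sketched as follows. This is an illustrative model, not the system's actual implementation: field names mirror the table columns, but `chainPhases`, `endFromDuration`, and the "carry the cursor forward when no end is computable" simplification are assumptions.

```typescript
interface ScheduledPhase {
  phase_type: 'soaking' | 'airdrying' | 'cracking' | 'shelling'
  scheduled_start_time: Date
  scheduled_end_time: Date | null
}

// scheduled_end_time = scheduled_start_time + duration (soaking/airdrying rule).
function endFromDuration(start: Date, durationMinutes: number): Date {
  return new Date(start.getTime() + durationMinutes * 60000)
}

// Each phase starts when the previous one is scheduled to end.
function chainPhases(
  firstStart: Date,
  phases: { phase_type: ScheduledPhase['phase_type']; durationMinutes?: number }[]
): ScheduledPhase[] {
  const out: ScheduledPhase[] = []
  let cursor = firstStart
  for (const p of phases) {
    const end = p.durationMinutes != null ? endFromDuration(cursor, p.durationMinutes) : null
    out.push({ phase_type: p.phase_type, scheduled_start_time: cursor, scheduled_end_time: end })
    // Phases without a duration (cracking, shelling) have no computed end,
    // so the cursor simply stays put: a simplification for this sketch.
    cursor = end ?? cursor
  }
  return out
}
```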

---

## 5. Machine Types

**Table:** `machine_types`

**Purpose:** Defines the types of machines available for cracking operations.

### Attributes

- **name** (TEXT, UNIQUE, NOT NULL)
  - Unique name identifying the machine type
  - Examples: "JC Cracker", "Meyer Cracker"

- **description** (TEXT, nullable)
  - Optional description of the machine type

### Related Tables

Machine types are referenced by:

- `experiment_phases.cracking_machine_type_id` - Defines which machine type to use for cracking in an experiment phase template
- `experiment_phase_executions.machine_type_id` - Specifies which machine was used for a specific cracking execution

---

## 6. Cracker Parameters (Machine-Specific)

### JC Cracker Parameters

**Table:** `jc_cracker_parameters`

**Purpose:** Stores parameters specific to JC Cracker machines.

#### Attributes

- **plate_contact_frequency_hz** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
  - Frequency of plate contact in Hertz

- **throughput_rate_pecans_sec** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
  - Rate of pecan processing in pecans per second

- **crush_amount_in** (DOUBLE PRECISION, NOT NULL, CHECK >= 0)
  - Amount of crushing in inches

- **entry_exit_height_diff_in** (DOUBLE PRECISION, NOT NULL)
  - Difference in height between entry and exit points in inches

### Meyer Cracker Parameters

**Table:** `meyer_cracker_parameters`

**Purpose:** Stores parameters specific to Meyer Cracker machines.

#### Attributes

- **motor_speed_hz** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
  - Motor speed in Hertz

- **jig_displacement_inches** (DOUBLE PRECISION, NOT NULL)
  - Jig displacement in inches

- **spring_stiffness_nm** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
  - Spring stiffness in Newton-meters

---

## Summary of Entity Relationships

1. **Experiment Phase** → Defines which phases are included and the machine type for cracking
2. **Experiment** → Belongs to an Experiment Phase; defines repetition requirements and weight per repetition
3. **Experiment Repetition** → Instance of an Experiment; can be scheduled and tracked
4. **Experiment Phase Execution** → Execution record for each phase (soaking, airdrying, cracking, shelling) within a repetition
5. **Machine Types** → Defines available cracking machines
6. **Cracker Parameters** → Machine-specific operational parameters (JC or Meyer)

### Key Relationships

- One Experiment Phase can have many Experiments
- One Experiment can have many Experiment Repetitions
- One Experiment Repetition can have multiple Phase Executions (one per phase type)
- Phase Executions are automatically created based on the Experiment Phase template configuration
- Cracking Phase Executions reference a Machine Type
- Machine Types can have associated Cracker Parameters (JC- or Meyer-specific)
@@ -26,9 +26,8 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
   const [loading, setLoading] = useState(true)
   const [error, setError] = useState<string | null>(null)
   const [currentView, setCurrentView] = useState('dashboard')
-  const [isExpanded, setIsExpanded] = useState(true)
+  const [isExpanded, setIsExpanded] = useState(false)
   const [isMobileOpen, setIsMobileOpen] = useState(false)
   const [isHovered, setIsHovered] = useState(false)

   // Valid dashboard views
   const validViews = ['dashboard', 'user-management', 'experiments', 'analytics', 'data-entry', 'vision-system', 'scheduling', 'video-library', 'profile']
@@ -53,6 +52,26 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
     }
   }

+  // Save sidebar expanded state to localStorage
+  const saveSidebarState = (expanded: boolean) => {
+    try {
+      localStorage.setItem('sidebar-expanded', String(expanded))
+    } catch (error) {
+      console.warn('Failed to save sidebar state to localStorage:', error)
+    }
+  }
+
+  // Get saved sidebar state from localStorage
+  const getSavedSidebarState = (): boolean => {
+    try {
+      const saved = localStorage.getItem('sidebar-expanded')
+      return saved === 'true'
+    } catch (error) {
+      console.warn('Failed to get saved sidebar state from localStorage:', error)
+      return false
+    }
+  }
+
   // Check if user has access to a specific view
   const hasAccessToView = (view: string): boolean => {
     if (!user) return false
@@ -80,6 +99,9 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps

   useEffect(() => {
     fetchUserProfile()
+    // Load saved sidebar state
+    const savedSidebarState = getSavedSidebarState()
+    setIsExpanded(savedSidebarState)
   }, [])

   // Restore saved view when user is loaded
@@ -144,7 +166,9 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
   }

   const toggleSidebar = () => {
-    setIsExpanded(!isExpanded)
+    const newState = !isExpanded
+    setIsExpanded(newState)
+    saveSidebarState(newState)
   }

   const toggleMobileSidebar = () => {
@@ -225,7 +249,7 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
         )
       case 'scheduling':
         return (
-          <ErrorBoundary>
+          <ErrorBoundary autoRetry={true} retryDelay={2000} maxRetries={3}>
             <Suspense fallback={<div className="p-6">Loading scheduling module...</div>}>
               <RemoteScheduling user={user} currentRoute={currentRoute} />
             </Suspense>
@@ -300,8 +324,6 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
         onViewChange={handleViewChange}
         isExpanded={isExpanded}
         isMobileOpen={isMobileOpen}
-        isHovered={isHovered}
-        setIsHovered={setIsHovered}
       />
       {/* Backdrop for mobile */}
       {isMobileOpen && (
@@ -312,7 +334,7 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
       )}
     </div>
     <div
-      className={`flex-1 transition-all duration-300 ease-in-out bg-gray-50 dark:bg-gray-900 flex flex-col min-h-0 ${isExpanded || isHovered ? "lg:ml-[290px]" : "lg:ml-[90px]"
+      className={`flex-1 transition-all duration-300 ease-in-out bg-gray-50 dark:bg-gray-900 flex flex-col min-h-0 ${isExpanded ? "lg:ml-[290px]" : "lg:ml-[90px]"
       } ${isMobileOpen ? "ml-0" : ""}`}
     >
       <TopNavbar
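The hunks above persist the sidebar's expanded flag in localStorage with try/catch guards. The same pattern can be factored into a small storage-agnostic helper; this sketch is illustrative (not code from the PR) and takes the storage object as a parameter so it can be exercised without a browser.

```typescript
// Minimal subset of the Web Storage interface the helpers rely on.
interface KVStore {
  getItem(key: string): string | null
  setItem(key: string, value: string): void
}

function saveFlag(store: KVStore, key: string, value: boolean): void {
  try {
    store.setItem(key, String(value))
  } catch {
    // Storage may be unavailable (private mode, quota); fail silently, as the PR does.
  }
}

function loadFlag(store: KVStore, key: string, fallback = false): boolean {
  try {
    const saved = store.getItem(key)
    return saved === null ? fallback : saved === 'true'
  } catch {
    return fallback
  }
}
```

In the browser, `localStorage` itself satisfies the `KVStore` shape, so `loadFlag(localStorage, 'sidebar-expanded')` would reproduce `getSavedSidebarState()`.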

@@ -5,20 +5,61 @@ type Props = {
   fallback?: ReactNode
   onRetry?: () => void
   showRetry?: boolean
+  autoRetry?: boolean
+  retryDelay?: number
+  maxRetries?: number
 }
-type State = { hasError: boolean }
+type State = { hasError: boolean; retryCount: number }

 export class ErrorBoundary extends Component<Props, State> {
-  state: State = { hasError: false }
+  private retryTimeoutId?: NodeJS.Timeout
+
+  state: State = { hasError: false, retryCount: 0 }

   static getDerivedStateFromError() {
     return { hasError: true }
   }

-  componentDidCatch() {}
+  componentDidCatch(error: Error, errorInfo: React.ErrorInfo) {
+    // Auto-retry logic for module federation loading issues
+    const maxRetries = this.props.maxRetries || 3
+    if (this.props.autoRetry !== false && this.state.retryCount < maxRetries) {
+      const delay = this.props.retryDelay || 2000
+      this.retryTimeoutId = setTimeout(() => {
+        this.setState(prevState => ({
+          hasError: false,
+          retryCount: prevState.retryCount + 1
+        }))
+        if (this.props.onRetry) {
+          this.props.onRetry()
+        }
+      }, delay)
+    }
+  }
+
+  componentDidUpdate(prevProps: Props, prevState: State) {
+    // Reset retry count if error is cleared and component successfully rendered
+    if (prevState.hasError && !this.state.hasError && this.state.retryCount > 0) {
+      // Give it a moment to see if it stays error-free
+      setTimeout(() => {
+        if (!this.state.hasError) {
+          this.setState({ retryCount: 0 })
+        }
+      }, 1000)
+    }
+  }
+
+  componentWillUnmount() {
+    if (this.retryTimeoutId) {
+      clearTimeout(this.retryTimeoutId)
+    }
+  }

   handleRetry = () => {
-    this.setState({ hasError: false })
+    if (this.retryTimeoutId) {
+      clearTimeout(this.retryTimeoutId)
+    }
+    this.setState({ hasError: false, retryCount: 0 })
     if (this.props.onRetry) {
       this.props.onRetry()
     }
@@ -43,6 +84,11 @@ export class ErrorBoundary extends Component<Props, State> {
           <h3 className="text-sm font-medium text-red-800">Something went wrong loading this section</h3>
           <div className="mt-2 text-sm text-red-700">
             <p>An error occurred while loading this component. Please try reloading it.</p>
+            {this.props.autoRetry !== false && this.state.retryCount < (this.props.maxRetries || 3) && (
+              <p className="mt-1 text-xs text-red-600">
+                Retrying automatically... (Attempt {this.state.retryCount + 1} of {(this.props.maxRetries || 3) + 1})
+              </p>
+            )}
           </div>
           {(this.props.showRetry !== false) && (
             <div className="mt-4">

@@ -3,7 +3,7 @@ import type { CreateExperimentRequest, UpdateExperimentRequest, ScheduleStatus,
 import { experimentPhaseManagement, machineTypeManagement } from '../lib/supabase'

 interface ExperimentFormProps {
-  initialData?: Partial<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }>
+  initialData?: Partial<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }> & { phase_id?: string | null }
   onSubmit: (data: CreateExperimentRequest | UpdateExperimentRequest) => Promise<void>
   onCancel: () => void
   isEditing?: boolean
@@ -12,31 +12,41 @@ interface ExperimentFormProps {
 }

 export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = false, loading = false, phaseId }: ExperimentFormProps) {
-  const [formData, setFormData] = useState<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }>({
-    experiment_number: initialData?.experiment_number || 0,
-    reps_required: initialData?.reps_required || 1,
-    weight_per_repetition_lbs: (initialData as any)?.weight_per_repetition_lbs || 1,
-    soaking_duration_hr: initialData?.soaking_duration_hr || 0,
-    air_drying_time_min: initialData?.air_drying_time_min || 0,
-    plate_contact_frequency_hz: initialData?.plate_contact_frequency_hz || 1,
-    throughput_rate_pecans_sec: initialData?.throughput_rate_pecans_sec || 1,
-    crush_amount_in: initialData?.crush_amount_in || 0,
-    entry_exit_height_diff_in: initialData?.entry_exit_height_diff_in || 0,
-    // Meyer-specific (UI only)
-    motor_speed_hz: (initialData as any)?.motor_speed_hz || 1,
-    jig_displacement_inches: (initialData as any)?.jig_displacement_inches || 0,
-    spring_stiffness_nm: (initialData as any)?.spring_stiffness_nm || 1,
-    schedule_status: initialData?.schedule_status || 'pending schedule',
-    results_status: initialData?.results_status || 'valid',
-    completion_status: initialData?.completion_status || false,
-    phase_id: initialData?.phase_id || phaseId
-  })
+  const getInitialFormState = (d: any) => ({
+    experiment_number: d?.experiment_number ?? 0,
+    reps_required: d?.reps_required ?? 1,
+    weight_per_repetition_lbs: d?.weight_per_repetition_lbs ?? 1,
+    soaking_duration_hr: d?.soaking?.soaking_duration_hr ?? d?.soaking_duration_hr ?? 0,
+    air_drying_time_min: d?.airdrying?.duration_minutes ?? d?.air_drying_time_min ?? 0,
+    plate_contact_frequency_hz: d?.cracking?.plate_contact_frequency_hz ?? d?.plate_contact_frequency_hz ?? 1,
+    throughput_rate_pecans_sec: d?.cracking?.throughput_rate_pecans_sec ?? d?.throughput_rate_pecans_sec ?? 1,
+    crush_amount_in: d?.cracking?.crush_amount_in ?? d?.crush_amount_in ?? 0,
+    entry_exit_height_diff_in: d?.cracking?.entry_exit_height_diff_in ?? d?.entry_exit_height_diff_in ?? 0,
+    motor_speed_hz: d?.cracking?.motor_speed_hz ?? d?.motor_speed_hz ?? 1,
+    jig_displacement_inches: d?.cracking?.jig_displacement_inches ?? d?.jig_displacement_inches ?? 0,
+    spring_stiffness_nm: d?.cracking?.spring_stiffness_nm ?? d?.spring_stiffness_nm ?? 1,
+    schedule_status: d?.schedule_status ?? 'pending schedule',
+    results_status: d?.results_status ?? 'valid',
+    completion_status: d?.completion_status ?? false,
+    phase_id: d?.phase_id ?? phaseId,
+    ring_gap_inches: d?.shelling?.ring_gap_inches ?? d?.ring_gap_inches ?? null,
+    drum_rpm: d?.shelling?.drum_rpm ?? d?.drum_rpm ?? null
+  })
+
+  const [formData, setFormData] = useState<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }>(() => getInitialFormState(initialData))

   const [errors, setErrors] = useState<Record<string, string>>({})
   const [phase, setPhase] = useState<ExperimentPhase | null>(null)
   const [crackingMachine, setCrackingMachine] = useState<MachineType | null>(null)
   const [metaLoading, setMetaLoading] = useState<boolean>(false)

+  // When initialData loads with phase config (edit mode), sync form state
+  useEffect(() => {
+    if ((initialData as any)?.id) {
+      setFormData(prev => ({ ...prev, ...getInitialFormState(initialData) }))
+    }
+  }, [initialData])
+
   useEffect(() => {
     const loadMeta = async () => {
       if (!phaseId) return
@@ -76,11 +86,11 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
     }

-    if (formData.soaking_duration_hr < 0) {
+    if ((formData.soaking_duration_hr ?? 0) < 0) {
       newErrors.soaking_duration_hr = 'Soaking duration cannot be negative'
     }

-    if (formData.air_drying_time_min < 0) {
+    if ((formData.air_drying_time_min ?? 0) < 0) {
       newErrors.air_drying_time_min = 'Air drying time cannot be negative'
     }

@@ -93,7 +103,7 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
       if (!formData.throughput_rate_pecans_sec || formData.throughput_rate_pecans_sec <= 0) {
         newErrors.throughput_rate_pecans_sec = 'Throughput rate must be positive'
       }
-      if (formData.crush_amount_in < 0) {
+      if ((formData.crush_amount_in ?? 0) < 0) {
         newErrors.crush_amount_in = 'Crush amount cannot be negative'
       }
     }
@@ -110,6 +120,16 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
       }
     }

+    // Shelling: if provided, must be positive
+    if (phase?.has_shelling) {
+      if (formData.ring_gap_inches != null && (typeof formData.ring_gap_inches !== 'number' || formData.ring_gap_inches <= 0)) {
+        newErrors.ring_gap_inches = 'Ring gap must be positive'
+      }
+      if (formData.drum_rpm != null && (typeof formData.drum_rpm !== 'number' || formData.drum_rpm <= 0)) {
+        newErrors.drum_rpm = 'Drum RPM must be positive'
+      }
+    }
+
     setErrors(newErrors)
     return Object.keys(newErrors).length === 0
   }
@@ -122,14 +142,25 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
     }

     try {
-      // Prepare data for submission
+      // Prepare data: include all phase params so they are stored in experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling
       const submitData = isEditing ? formData : {
         experiment_number: formData.experiment_number,
         reps_required: formData.reps_required,
         weight_per_repetition_lbs: formData.weight_per_repetition_lbs,
         results_status: formData.results_status,
         completion_status: formData.completion_status,
-        phase_id: formData.phase_id
+        phase_id: formData.phase_id,
+        soaking_duration_hr: formData.soaking_duration_hr,
+        air_drying_time_min: formData.air_drying_time_min,
+        plate_contact_frequency_hz: formData.plate_contact_frequency_hz,
+        throughput_rate_pecans_sec: formData.throughput_rate_pecans_sec,
+        crush_amount_in: formData.crush_amount_in,
+        entry_exit_height_diff_in: formData.entry_exit_height_diff_in,
+        motor_speed_hz: (formData as any).motor_speed_hz,
+        jig_displacement_inches: (formData as any).jig_displacement_inches,
+        spring_stiffness_nm: (formData as any).spring_stiffness_nm,
+        ring_gap_inches: formData.ring_gap_inches ?? undefined,
+        drum_rpm: formData.drum_rpm ?? undefined
       }

       await onSubmit(submitData)
@@ -138,7 +169,7 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
     }
   }

-  const handleInputChange = (field: keyof typeof formData, value: string | number | boolean) => {
+  const handleInputChange = (field: keyof typeof formData, value: string | number | boolean | null | undefined) => {
     setFormData(prev => ({
       ...prev,
       [field]: value
@@ -441,18 +472,40 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
           <h3 className="text-lg font-medium text-gray-900 mb-4">Shelling</h3>
           <div className="grid grid-cols-1 md:grid-cols-2 gap-6">
             <div>
-              <label className="block text-sm font-medium text-gray-700 mb-2">
-                Shelling Start Offset (minutes)
+              <label htmlFor="ring_gap_inches" className="block text-sm font-medium text-gray-700 mb-2">
+                Ring gap (inches)
               </label>
               <input
                 type="number"
-                value={(formData as any).shelling_start_offset_min || 0}
-                onChange={(e) => handleInputChange('shelling_start_offset_min' as any, parseInt(e.target.value) || 0)}
-                className="max-w-xs px-3 py-2 border border-gray-300 rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition-colors text-sm"
-                placeholder="0"
+                id="ring_gap_inches"
+                value={formData.ring_gap_inches ?? ''}
+                onChange={(e) => handleInputChange('ring_gap_inches' as any, e.target.value === '' ? null : parseFloat(e.target.value))}
+                className={`max-w-xs px-3 py-2 border rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition-colors text-sm ${errors.ring_gap_inches ? 'border-red-300' : 'border-gray-300'}`}
+                placeholder="e.g. 0.25"
+                min="0"
+                step="0.01"
               />
+              {errors.ring_gap_inches && (
+                <p className="mt-1 text-sm text-red-600">{errors.ring_gap_inches}</p>
+              )}
             </div>
+            <div>
+              <label htmlFor="drum_rpm" className="block text-sm font-medium text-gray-700 mb-2">
+                Drum RPM
+              </label>
+              <input
+                type="number"
+                id="drum_rpm"
+                value={formData.drum_rpm ?? ''}
+                onChange={(e) => handleInputChange('drum_rpm' as any, e.target.value === '' ? null : parseInt(e.target.value, 10))}
+                className={`max-w-xs px-3 py-2 border rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition-colors text-sm ${errors.drum_rpm ? 'border-red-300' : 'border-gray-300'}`}
+                placeholder="e.g. 300"
+                min="1"
+                step="1"
+              />
+              {errors.drum_rpm && (
+                <p className="mt-1 text-sm text-red-600">{errors.drum_rpm}</p>
+              )}
+            </div>
           </div>
         </div>
@@ -1,4 +1,4 @@
-import { useState } from 'react'
+import { useState, useEffect } from 'react'
 import { ExperimentForm } from './ExperimentForm'
 import { experimentManagement } from '../lib/supabase'
 import type { Experiment, CreateExperimentRequest, UpdateExperimentRequest } from '../lib/supabase'
@@ -13,9 +13,20 @@ interface ExperimentModalProps {
 export function ExperimentModal({ experiment, onClose, onExperimentSaved, phaseId }: ExperimentModalProps) {
   const [loading, setLoading] = useState(false)
   const [error, setError] = useState<string | null>(null)
+  const [initialData, setInitialData] = useState<Experiment | (Experiment & { soaking?: any; airdrying?: any; cracking?: any; shelling?: any }) | undefined>(experiment ?? undefined)

   const isEditing = !!experiment

+  useEffect(() => {
+    if (experiment) {
+      experimentManagement.getExperimentWithPhaseConfig(experiment.id)
+        .then((data) => setInitialData(data ?? experiment))
+        .catch(() => setInitialData(experiment))
+    } else {
+      setInitialData(undefined)
+    }
+  }, [experiment?.id])
+
   const handleSubmit = async (data: CreateExperimentRequest | UpdateExperimentRequest) => {
     setError(null)
     setLoading(true)
@@ -24,22 +35,24 @@ export function ExperimentModal({ experiment, onClose, onExperimentSaved, phaseI
       let savedExperiment: Experiment

       if (isEditing && experiment) {
-        // Check if experiment number is unique (excluding current experiment)
+        // Check if experiment number is unique within this phase (excluding current experiment)
         if ('experiment_number' in data && data.experiment_number !== undefined && data.experiment_number !== experiment.experiment_number) {
-          const isUnique = await experimentManagement.isExperimentNumberUnique(data.experiment_number, experiment.id)
+          const phaseIdToCheck = data.phase_id ?? experiment.phase_id ?? phaseId
+          const isUnique = await experimentManagement.isExperimentNumberUnique(data.experiment_number, phaseIdToCheck ?? undefined, experiment.id)
           if (!isUnique) {
-            setError('Experiment number already exists. Please choose a different number.')
+            setError('Experiment number already exists in this phase. Please choose a different number.')
             return
           }
         }

         savedExperiment = await experimentManagement.updateExperiment(experiment.id, data)
       } else {
-        // Check if experiment number is unique for new experiments
+        // Check if experiment number is unique within this phase for new experiments
         const createData = data as CreateExperimentRequest
-        const isUnique = await experimentManagement.isExperimentNumberUnique(createData.experiment_number)
+        const phaseIdToCheck = createData.phase_id ?? phaseId
+        const isUnique = await experimentManagement.isExperimentNumberUnique(createData.experiment_number, phaseIdToCheck ?? undefined)
         if (!isUnique) {
-          setError('Experiment number already exists. Please choose a different number.')
+          setError('Experiment number already exists in this phase. Please choose a different number.')
           return
         }

@@ -115,7 +128,7 @@ export function ExperimentModal({ experiment, onClose, onExperimentSaved, phaseI

       {/* Form */}
       <ExperimentForm
-        initialData={experiment}
+        initialData={initialData ? { ...initialData, phase_id: initialData.phase_id ?? undefined } : undefined}
         onSubmit={handleSubmit}
         onCancel={handleCancel}
         isEditing={isEditing}

@@ -31,8 +31,8 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
       setPhases(phasesData)
       setCurrentUser(userData)
     } catch (err: any) {
-      setError(err.message || 'Failed to load experiment phases')
-      console.error('Load experiment phases error:', err)
+      setError(err.message || 'Failed to load experiment books')
+      console.error('Load experiment books error:', err)
     } finally {
       setLoading(false)
     }
@@ -61,16 +61,16 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
       <div className="mb-8">
         <div className="flex justify-between items-center">
           <div>
-            <h1 className="text-3xl font-bold text-gray-900 dark:text-white">Experiment Phases</h1>
-            <p className="mt-2 text-gray-600 dark:text-gray-400">Select an experiment phase to view and manage its experiments</p>
-            <p className="mt-2 text-gray-600 dark:text-gray-400">Experiment phases help organize experiments into logical groups for easier navigation and management.</p>
+            <h1 className="text-3xl font-bold text-gray-900 dark:text-white">Experiment Books</h1>
+            <p className="mt-2 text-gray-600 dark:text-gray-400">Select an experiment book to view and manage its experiments</p>
+            <p className="mt-2 text-gray-600 dark:text-gray-400">Experiment books help organize experiments into logical groups for easier navigation and management.</p>
           </div>
           {canManagePhases && (
             <button
               onClick={() => setShowCreateModal(true)}
               className="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md text-white bg-blue-600 hover:bg-blue-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-blue-500"
             >
-              ➕ New Phase
+              ➕ New Book
             </button>
           )}
         </div>
@@ -162,9 +162,9 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
         <svg className="mx-auto h-12 w-12 text-gray-400 dark:text-gray-500" fill="none" viewBox="0 0 24 24" stroke="currentColor">
           <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19.428 15.428a2 2 0 00-1.022-.547l-2.387-.477a6 6 0 00-3.86.517l-.318.158a6 6 0 01-3.86.517L6.05 15.21a2 2 0 00-1.806.547M8 4h8l-1 1v5.172a2 2 0 00.586 1.414l5 5c1.26 1.26.367 3.414-1.415 3.414H4.828c-1.782 0-2.674-2.154-1.414-3.414l5-5A2 2 0 009 10.172V5L8 4z" />
         </svg>
-        <h3 className="mt-2 text-sm font-medium text-gray-900 dark:text-white">No experiment phases found</h3>
+        <h3 className="mt-2 text-sm font-medium text-gray-900 dark:text-white">No experiment books found</h3>
         <p className="mt-1 text-sm text-gray-500 dark:text-gray-400">
-          Get started by creating your first experiment phase.
+          Get started by creating your first experiment book.
         </p>
         {canManagePhases && (
           <div className="mt-6">
@@ -172,7 +172,7 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
               onClick={() => setShowCreateModal(true)}
|
||||
className="inline-flex items-center px-4 py-2 border border-transparent shadow-sm text-sm font-medium rounded-md text-white bg-blue-600 hover:bg-blue-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-blue-500"
|
||||
>
|
||||
➕ Create First Phase
|
||||
➕ Create First Book
|
||||
</button>
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -193,7 +193,7 @@ export function PhaseExperiments({ phase, onBack }: PhaseExperimentsProps) {
|
||||
<svg className="w-5 h-5 mr-2" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 19l-7-7 7-7" />
|
||||
</svg>
|
||||
Back to Phases
|
||||
Back to Books
|
||||
</button>
|
||||
</div>
|
||||
|
||||
@@ -203,7 +203,7 @@ export function PhaseExperiments({ phase, onBack }: PhaseExperimentsProps) {
|
||||
{phase.description && (
|
||||
<p className="mt-2 text-gray-600">{phase.description}</p>
|
||||
)}
|
||||
<p className="mt-2 text-gray-600">Manage experiments within this phase</p>
|
||||
<p className="mt-2 text-gray-600">Manage experiments within this book</p>
|
||||
</div>
|
||||
{canManageExperiments && (
|
||||
<button
|
||||
@@ -417,9 +417,9 @@ export function PhaseExperiments({ phase, onBack }: PhaseExperimentsProps) {
|
||||
<svg className="mx-auto h-12 w-12 text-gray-400" fill="none" viewBox="0 0 24 24" stroke="currentColor">
|
||||
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 5H7a2 2 0 00-2 2v10a2 2 0 002 2h8a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z" />
|
||||
</svg>
|
||||
<h3 className="mt-2 text-sm font-medium text-gray-900">No experiments found in this phase</h3>
|
||||
<h3 className="mt-2 text-sm font-medium text-gray-900">No experiments found in this book</h3>
|
||||
<p className="mt-1 text-sm text-gray-500">
|
||||
Get started by creating your first experiment in this phase.
|
||||
Get started by creating your first experiment in this book.
|
||||
</p>
|
||||
{canManageExperiments && (
|
||||
<div className="mt-6">
|
||||
|
||||
@@ -147,7 +147,7 @@ export function PhaseForm({ onSubmit, onCancel, loading = false }: PhaseFormProp
|
||||
onChange={(e) => handleInputChange('description', e.target.value)}
|
||||
rows={3}
|
||||
className="w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
|
||||
placeholder="Optional description of this experiment phase"
|
||||
placeholder="Optional description of this experiment book"
|
||||
disabled={loading}
|
||||
/>
|
||||
</div>
|
||||
|
||||
@@ -21,7 +21,7 @@ export function PhaseModal({ onClose, onPhaseCreated }: PhaseModalProps) {
|
||||
onPhaseCreated(newPhase)
|
||||
onClose()
|
||||
} catch (err: any) {
|
||||
setError(err.message || 'Failed to create experiment phase')
|
||||
setError(err.message || 'Failed to create experiment book')
|
||||
console.error('Create phase error:', err)
|
||||
} finally {
|
||||
setLoading(false)
|
||||
@@ -35,7 +35,7 @@ export function PhaseModal({ onClose, onPhaseCreated }: PhaseModalProps) {
|
||||
{/* Header */}
|
||||
<div className="flex items-center justify-between mb-6">
|
||||
<h3 className="text-lg font-medium text-gray-900">
|
||||
Create New Experiment Phase
|
||||
Create New Experiment Book
|
||||
</h3>
|
||||
<button
|
||||
onClick={onClose}
|
||||
|
||||
@@ -7,8 +7,6 @@ interface SidebarProps {
|
||||
onViewChange: (view: string) => void
|
||||
isExpanded?: boolean
|
||||
isMobileOpen?: boolean
|
||||
isHovered?: boolean
|
||||
setIsHovered?: (hovered: boolean) => void
|
||||
}
|
||||
|
||||
interface MenuItem {
|
||||
@@ -23,10 +21,8 @@ export function Sidebar({
|
||||
user,
|
||||
currentView,
|
||||
onViewChange,
|
||||
isExpanded = true,
|
||||
isMobileOpen = false,
|
||||
isHovered = false,
|
||||
setIsHovered
|
||||
isExpanded = false,
|
||||
isMobileOpen = false
|
||||
}: SidebarProps) {
|
||||
const [openSubmenu, setOpenSubmenu] = useState<number | null>(null)
|
||||
const [subMenuHeight, setSubMenuHeight] = useState<Record<string, number>>({})
|
||||
@@ -170,7 +166,7 @@ export function Sidebar({
|
||||
className={`menu-item group ${openSubmenu === index
|
||||
? "menu-item-active"
|
||||
: "menu-item-inactive"
|
||||
} cursor-pointer ${!isExpanded && !isHovered
|
||||
} cursor-pointer ${!isExpanded
|
||||
? "lg:justify-center"
|
||||
: "lg:justify-start"
|
||||
}`}
|
||||
@@ -183,10 +179,10 @@ export function Sidebar({
|
||||
>
|
||||
{nav.icon}
|
||||
</span>
|
||||
{(isExpanded || isHovered || isMobileOpen) && (
|
||||
{(isExpanded || isMobileOpen) && (
|
||||
<span className="menu-item-text">{nav.name}</span>
|
||||
)}
|
||||
{(isExpanded || isHovered || isMobileOpen) && (
|
||||
{(isExpanded || isMobileOpen) && (
|
||||
<svg
|
||||
className={`ml-auto w-5 h-5 transition-transform duration-200 ${openSubmenu === index
|
||||
? "rotate-180 text-brand-500"
|
||||
@@ -214,12 +210,12 @@ export function Sidebar({
|
||||
>
|
||||
{nav.icon}
|
||||
</span>
|
||||
{(isExpanded || isHovered || isMobileOpen) && (
|
||||
{(isExpanded || isMobileOpen) && (
|
||||
<span className="menu-item-text">{nav.name}</span>
|
||||
)}
|
||||
</button>
|
||||
)}
|
||||
{nav.subItems && (isExpanded || isHovered || isMobileOpen) && (
|
||||
{nav.subItems && (isExpanded || isMobileOpen) && (
|
||||
<div
|
||||
ref={(el) => {
|
||||
subMenuRefs.current[`submenu-${index}`] = el
|
||||
@@ -265,21 +261,17 @@ export function Sidebar({
|
||||
className={`fixed mt-16 flex flex-col lg:mt-0 top-0 px-5 left-0 bg-white dark:bg-gray-900 dark:border-gray-800 text-gray-900 h-screen transition-all duration-300 ease-in-out z-50 border-r border-gray-200
|
||||
${isExpanded || isMobileOpen
|
||||
? "w-[290px]"
|
||||
: isHovered
|
||||
? "w-[290px]"
|
||||
: "w-[90px]"
|
||||
: "w-[90px]"
|
||||
}
|
||||
${isMobileOpen ? "translate-x-0" : "-translate-x-full"}
|
||||
lg:translate-x-0`}
|
||||
onMouseEnter={() => !isExpanded && setIsHovered && setIsHovered(true)}
|
||||
onMouseLeave={() => setIsHovered && setIsHovered(false)}
|
||||
>
|
||||
<div
|
||||
className={`py-8 flex ${!isExpanded && !isHovered ? "lg:justify-center" : "justify-start"
|
||||
className={`py-8 flex ${!isExpanded ? "lg:justify-center" : "justify-start"
|
||||
}`}
|
||||
>
|
||||
<div>
|
||||
{isExpanded || isHovered || isMobileOpen ? (
|
||||
{isExpanded || isMobileOpen ? (
|
||||
<>
|
||||
<h1 className="text-xl font-bold text-gray-800 dark:text-white/90">Pecan Experiments</h1>
|
||||
<p className="text-sm text-gray-500 dark:text-gray-400">Research Dashboard</p>
|
||||
@@ -297,12 +289,12 @@ export function Sidebar({
|
||||
<div className="flex flex-col gap-4">
|
||||
<div>
|
||||
<h2
|
||||
className={`mb-4 text-xs uppercase flex leading-[20px] text-gray-400 ${!isExpanded && !isHovered
|
||||
className={`mb-4 text-xs uppercase flex leading-[20px] text-gray-400 ${!isExpanded
|
||||
? "lg:justify-center"
|
||||
: "justify-start"
|
||||
}`}
|
||||
>
|
||||
{isExpanded || isHovered || isMobileOpen ? (
|
||||
{isExpanded || isMobileOpen ? (
|
||||
"Menu"
|
||||
) : (
|
||||
<svg className="size-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
|
||||
@@ -60,6 +60,8 @@ export interface Experiment {
|
||||
airdrying_id?: string | null
|
||||
cracking_id?: string | null
|
||||
shelling_id?: string | null
|
||||
ring_gap_inches?: number | null
|
||||
drum_rpm?: number | null
|
||||
created_at: string
|
||||
updated_at: string
|
||||
created_by: string
|
||||
@@ -170,6 +172,51 @@ export interface UpdateExperimentPhaseRequest {
|
||||
has_shelling?: boolean
|
||||
}
|
||||
|
||||
// Experiment-level phase config (one row per experiment per phase; stored in experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling)
|
||||
export interface ExperimentSoakingConfig {
|
||||
id: string
|
||||
experiment_id: string
|
||||
soaking_duration_hr: number
|
||||
created_at: string
|
||||
updated_at: string
|
||||
created_by: string
|
||||
}
|
||||
|
||||
export interface ExperimentAirdryingConfig {
|
||||
id: string
|
||||
experiment_id: string
|
||||
duration_minutes: number
|
||||
created_at: string
|
||||
updated_at: string
|
||||
created_by: string
|
||||
}
|
||||
|
||||
export interface ExperimentCrackingConfig {
|
||||
id: string
|
||||
experiment_id: string
|
||||
machine_type_id: string
|
||||
plate_contact_frequency_hz?: number | null
|
||||
throughput_rate_pecans_sec?: number | null
|
||||
crush_amount_in?: number | null
|
||||
entry_exit_height_diff_in?: number | null
|
||||
motor_speed_hz?: number | null
|
||||
jig_displacement_inches?: number | null
|
||||
spring_stiffness_nm?: number | null
|
||||
created_at: string
|
||||
updated_at: string
|
||||
created_by: string
|
||||
}
|
||||
|
||||
export interface ExperimentShellingConfig {
|
||||
id: string
|
||||
experiment_id: string
|
||||
ring_gap_inches?: number | null
|
||||
drum_rpm?: number | null
|
||||
created_at: string
|
||||
updated_at: string
|
||||
created_by: string
|
||||
}
|
||||
|
||||
export interface CreateExperimentRequest {
|
||||
experiment_number: number
|
||||
reps_required: number
|
||||
@@ -177,6 +224,19 @@ export interface CreateExperimentRequest {
|
||||
results_status?: ResultsStatus
|
||||
completion_status?: boolean
|
||||
phase_id?: string
|
||||
// Phase config (stored in experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling)
|
||||
soaking_duration_hr?: number
|
||||
air_drying_time_min?: number
|
||||
// Cracking: machine_type comes from book; params below are JC or Meyer specific
|
||||
plate_contact_frequency_hz?: number
|
||||
throughput_rate_pecans_sec?: number
|
||||
crush_amount_in?: number
|
||||
entry_exit_height_diff_in?: number
|
||||
motor_speed_hz?: number
|
||||
jig_displacement_inches?: number
|
||||
spring_stiffness_nm?: number
|
||||
ring_gap_inches?: number | null
|
||||
drum_rpm?: number | null
|
||||
}
|
||||
|
||||
export interface UpdateExperimentRequest {
|
||||
@@ -186,6 +246,17 @@ export interface UpdateExperimentRequest {
|
||||
results_status?: ResultsStatus
|
||||
completion_status?: boolean
|
||||
phase_id?: string
|
||||
soaking_duration_hr?: number
|
||||
air_drying_time_min?: number
|
||||
plate_contact_frequency_hz?: number
|
||||
throughput_rate_pecans_sec?: number
|
||||
crush_amount_in?: number
|
||||
entry_exit_height_diff_in?: number
|
||||
motor_speed_hz?: number
|
||||
jig_displacement_inches?: number
|
||||
spring_stiffness_nm?: number
|
||||
ring_gap_inches?: number | null
|
||||
drum_rpm?: number | null
|
||||
}
|
||||
|
||||
export interface CreateRepetitionRequest {
|
||||
@@ -614,12 +685,12 @@ export const userManagement = {
|
||||
}
|
||||
}
|
||||
|
||||
// Experiment phase management utility functions
|
||||
// Experiment book management (table: experiment_books)
|
||||
export const experimentPhaseManagement = {
|
||||
// Get all experiment phases
|
||||
// Get all experiment books
|
||||
async getAllExperimentPhases(): Promise<ExperimentPhase[]> {
|
||||
const { data, error } = await supabase
|
||||
.from('experiment_phases')
|
||||
.from('experiment_books')
|
||||
.select('*')
|
||||
.order('created_at', { ascending: false })
|
||||
|
||||
@@ -627,10 +698,10 @@ export const experimentPhaseManagement = {
|
||||
return data
|
||||
},
|
||||
|
||||
// Get experiment phase by ID
|
||||
// Get experiment book by ID
|
||||
async getExperimentPhaseById(id: string): Promise<ExperimentPhase | null> {
|
||||
const { data, error } = await supabase
|
||||
.from('experiment_phases')
|
||||
.from('experiment_books')
|
||||
.select('*')
|
||||
.eq('id', id)
|
||||
.single()
|
||||
@@ -642,13 +713,13 @@ export const experimentPhaseManagement = {
|
||||
return data
|
||||
},
|
||||
|
||||
// Create a new experiment phase
|
||||
// Create a new experiment book
|
||||
async createExperimentPhase(phaseData: CreateExperimentPhaseRequest): Promise<ExperimentPhase> {
|
||||
const { data: { user }, error: authError } = await supabase.auth.getUser()
|
||||
if (authError || !user) throw new Error('User not authenticated')
|
||||
|
||||
const { data, error } = await supabase
|
||||
.from('experiment_phases')
|
||||
.from('experiment_books')
|
||||
.insert({
|
||||
...phaseData,
|
||||
created_by: user.id
|
||||
@@ -660,10 +731,10 @@ export const experimentPhaseManagement = {
|
||||
return data
|
||||
},
|
||||
|
||||
// Update an experiment phase
|
||||
// Update an experiment book
|
||||
async updateExperimentPhase(id: string, updates: UpdateExperimentPhaseRequest): Promise<ExperimentPhase> {
|
||||
const { data, error } = await supabase
|
||||
.from('experiment_phases')
|
||||
.from('experiment_books')
|
||||
.update(updates)
|
||||
.eq('id', id)
|
||||
.select()
|
||||
@@ -673,10 +744,10 @@ export const experimentPhaseManagement = {
|
||||
return data
|
||||
},
|
||||
|
||||
// Delete an experiment phase
|
||||
// Delete an experiment book
|
||||
async deleteExperimentPhase(id: string): Promise<void> {
|
||||
const { error } = await supabase
|
||||
.from('experiment_phases')
|
||||
.from('experiment_books')
|
||||
.delete()
|
||||
.eq('id', id)
|
||||
|
||||
@@ -724,33 +795,170 @@ export const experimentManagement = {
|
||||
return data
|
||||
},
|
||||
|
||||
// Create a new experiment
|
||||
// Get experiment with its phase config (soaking, airdrying, cracking, shelling) for edit form
|
||||
async getExperimentWithPhaseConfig(id: string): Promise<(Experiment & {
|
||||
soaking?: ExperimentSoakingConfig | null
|
||||
airdrying?: ExperimentAirdryingConfig | null
|
||||
cracking?: ExperimentCrackingConfig | null
|
||||
shelling?: ExperimentShellingConfig | null
|
||||
}) | null> {
|
||||
const experiment = await this.getExperimentById(id)
|
||||
if (!experiment) return null
|
||||
|
||||
const [soakingRes, airdryingRes, crackingRes, shellingRes] = await Promise.all([
|
||||
supabase.from('experiment_soaking').select('*').eq('experiment_id', id).maybeSingle(),
|
||||
supabase.from('experiment_airdrying').select('*').eq('experiment_id', id).maybeSingle(),
|
||||
supabase.from('experiment_cracking').select('*').eq('experiment_id', id).maybeSingle(),
|
||||
supabase.from('experiment_shelling').select('*').eq('experiment_id', id).maybeSingle()
|
||||
])
|
||||
if (soakingRes.error) throw soakingRes.error
|
||||
if (airdryingRes.error) throw airdryingRes.error
|
||||
if (crackingRes.error) throw crackingRes.error
|
||||
if (shellingRes.error) throw shellingRes.error
|
||||
|
||||
return {
|
||||
...experiment,
|
||||
soaking: soakingRes.data ?? null,
|
||||
airdrying: airdryingRes.data ?? null,
|
||||
cracking: crackingRes.data ?? null,
|
||||
shelling: shellingRes.data ?? null
|
||||
}
|
||||
},
|
||||
|
||||
// Create a new experiment and its phase config rows (experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling)
|
||||
async createExperiment(experimentData: CreateExperimentRequest): Promise<Experiment> {
|
||||
const { data: { user }, error: authError } = await supabase.auth.getUser()
|
||||
if (authError || !user) throw new Error('User not authenticated')
|
||||
|
||||
const { data, error } = await supabase
|
||||
const phaseId = experimentData.phase_id
|
||||
const corePayload = {
|
||||
experiment_number: experimentData.experiment_number,
|
||||
reps_required: experimentData.reps_required,
|
||||
weight_per_repetition_lbs: experimentData.weight_per_repetition_lbs,
|
||||
results_status: experimentData.results_status ?? 'valid',
|
||||
completion_status: experimentData.completion_status ?? false,
|
||||
phase_id: phaseId,
|
||||
created_by: user.id
|
||||
}
|
||||
// phase_id required for phase configs
|
||||
if (!phaseId) {
|
||||
const { data, error } = await supabase.from('experiments').insert(corePayload).select().single()
|
||||
if (error) throw error
|
||||
return data
|
||||
}
|
||||
|
||||
const { data: experiment, error } = await supabase
|
||||
.from('experiments')
|
||||
.insert({
|
||||
...experimentData,
|
||||
created_by: user.id
|
||||
})
|
||||
.insert(corePayload)
|
||||
.select()
|
||||
.single()
|
||||
|
||||
if (error) throw error
|
||||
return data
|
||||
|
||||
const book = await experimentPhaseManagement.getExperimentPhaseById(phaseId)
|
||||
if (!book) return experiment
|
||||
|
||||
if (book.has_soaking && experimentData.soaking_duration_hr != null) {
|
||||
await supabase.from('experiment_soaking').insert({
|
||||
experiment_id: experiment.id,
|
||||
soaking_duration_hr: experimentData.soaking_duration_hr,
|
||||
created_by: user.id
|
||||
})
|
||||
}
|
||||
if (book.has_airdrying && experimentData.air_drying_time_min != null) {
|
||||
await supabase.from('experiment_airdrying').insert({
|
||||
experiment_id: experiment.id,
|
||||
duration_minutes: experimentData.air_drying_time_min,
|
||||
created_by: user.id
|
||||
})
|
||||
}
|
||||
if (book.has_cracking && book.cracking_machine_type_id) {
|
||||
const crackPayload: Record<string, unknown> = {
|
||||
experiment_id: experiment.id,
|
||||
machine_type_id: book.cracking_machine_type_id,
|
||||
created_by: user.id
|
||||
}
|
||||
if (experimentData.plate_contact_frequency_hz != null) crackPayload.plate_contact_frequency_hz = experimentData.plate_contact_frequency_hz
|
||||
if (experimentData.throughput_rate_pecans_sec != null) crackPayload.throughput_rate_pecans_sec = experimentData.throughput_rate_pecans_sec
|
||||
if (experimentData.crush_amount_in != null) crackPayload.crush_amount_in = experimentData.crush_amount_in
|
||||
if (experimentData.entry_exit_height_diff_in != null) crackPayload.entry_exit_height_diff_in = experimentData.entry_exit_height_diff_in
|
||||
if (experimentData.motor_speed_hz != null) crackPayload.motor_speed_hz = experimentData.motor_speed_hz
|
||||
if (experimentData.jig_displacement_inches != null) crackPayload.jig_displacement_inches = experimentData.jig_displacement_inches
|
||||
if (experimentData.spring_stiffness_nm != null) crackPayload.spring_stiffness_nm = experimentData.spring_stiffness_nm
|
||||
await supabase.from('experiment_cracking').insert(crackPayload)
|
||||
}
|
||||
if (book.has_shelling && (experimentData.ring_gap_inches != null || experimentData.drum_rpm != null)) {
|
||||
await supabase.from('experiment_shelling').insert({
|
||||
experiment_id: experiment.id,
|
||||
ring_gap_inches: experimentData.ring_gap_inches ?? null,
|
||||
drum_rpm: experimentData.drum_rpm ?? null,
|
||||
created_by: user.id
|
||||
})
|
||||
}
|
||||
|
||||
return experiment
|
||||
},
|
||||
|
||||
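The conditional child-row inserts above share one pattern for the cracking row: copy only the parameters the caller actually supplied into the insert payload. A minimal standalone sketch of that filtering step, with a hypothetical `buildCrackPayload` helper name (the real diff builds the payload inline with a chain of `!= null` checks):

```typescript
// Keys accepted by the experiment_cracking child table (from the diff).
const CRACK_KEYS = [
  'plate_contact_frequency_hz',
  'throughput_rate_pecans_sec',
  'crush_amount_in',
  'entry_exit_height_diff_in',
  'motor_speed_hz',
  'jig_displacement_inches',
  'spring_stiffness_nm',
] as const

type CrackKey = typeof CRACK_KEYS[number]

// Start from base columns (experiment_id, machine_type_id, created_by) and
// copy in only parameters that are neither null nor undefined, mirroring
// the `!= null` checks in createExperiment.
function buildCrackPayload(
  input: Partial<Record<CrackKey, number | null>>,
  base: Record<string, unknown>
): Record<string, unknown> {
  const payload = { ...base }
  for (const k of CRACK_KEYS) {
    if (input[k] != null) payload[k] = input[k]
  }
  return payload
}
```

Because `!= null` is a loose comparison, it drops both `null` and `undefined`, so fields the form left empty never reach the database row.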
-  // Update an experiment
+  // Update an experiment and upsert its phase config rows
   async updateExperiment(id: string, updates: UpdateExperimentRequest): Promise<Experiment> {
-    const { data, error } = await supabase
-      .from('experiments')
-      .update(updates)
-      .eq('id', id)
-      .select()
-      .single()
+    const { data: { user }, error: authError } = await supabase.auth.getUser()
+    if (authError || !user) throw new Error('User not authenticated')
+
+    const coreKeys = ['experiment_number', 'reps_required', 'weight_per_repetition_lbs', 'results_status', 'completion_status', 'phase_id'] as const
+    const coreUpdates: Partial<UpdateExperimentRequest> = {}
+    for (const k of coreKeys) {
+      if (updates[k] !== undefined) coreUpdates[k] = updates[k]
+    }
+    if (Object.keys(coreUpdates).length > 0) {
+      const { data, error } = await supabase.from('experiments').update(coreUpdates).eq('id', id).select().single()
+      if (error) throw error
+    }
+
+    if (updates.soaking_duration_hr !== undefined) {
+      const { data: existing } = await supabase.from('experiment_soaking').select('id').eq('experiment_id', id).maybeSingle()
+      if (existing) {
+        await supabase.from('experiment_soaking').update({ soaking_duration_hr: updates.soaking_duration_hr, updated_at: new Date().toISOString() }).eq('experiment_id', id)
+      } else {
+        await supabase.from('experiment_soaking').insert({ experiment_id: id, soaking_duration_hr: updates.soaking_duration_hr, created_by: user.id })
+      }
+    }
+    if (updates.air_drying_time_min !== undefined) {
+      const { data: existing } = await supabase.from('experiment_airdrying').select('id').eq('experiment_id', id).maybeSingle()
+      if (existing) {
+        await supabase.from('experiment_airdrying').update({ duration_minutes: updates.air_drying_time_min, updated_at: new Date().toISOString() }).eq('experiment_id', id)
+      } else {
+        await supabase.from('experiment_airdrying').insert({ experiment_id: id, duration_minutes: updates.air_drying_time_min, created_by: user.id })
+      }
+    }
+    const crackKeys = ['plate_contact_frequency_hz', 'throughput_rate_pecans_sec', 'crush_amount_in', 'entry_exit_height_diff_in', 'motor_speed_hz', 'jig_displacement_inches', 'spring_stiffness_nm'] as const
+    const hasCrackUpdates = crackKeys.some(k => updates[k] !== undefined)
+    if (hasCrackUpdates) {
+      const { data: existing } = await supabase.from('experiment_cracking').select('id').eq('experiment_id', id).maybeSingle()
+      const crackPayload: Record<string, unknown> = {}
+      crackKeys.forEach(k => { if (updates[k] !== undefined) crackPayload[k] = updates[k] })
+      if (Object.keys(crackPayload).length > 0) {
+        if (existing) {
+          await supabase.from('experiment_cracking').update({ ...crackPayload, updated_at: new Date().toISOString() }).eq('experiment_id', id)
+        } else {
+          const exp = await this.getExperimentById(id)
+          const book = exp?.phase_id ? await experimentPhaseManagement.getExperimentPhaseById(exp.phase_id) : null
+          if (book?.has_cracking && book.cracking_machine_type_id) {
+            await supabase.from('experiment_cracking').insert({ experiment_id: id, machine_type_id: book.cracking_machine_type_id, ...crackPayload, created_by: user.id })
+          }
+        }
+      }
+    }
+    if (updates.ring_gap_inches !== undefined || updates.drum_rpm !== undefined) {
+      const { data: existing } = await supabase.from('experiment_shelling').select('id').eq('experiment_id', id).maybeSingle()
+      const shellPayload = { ring_gap_inches: updates.ring_gap_inches ?? null, drum_rpm: updates.drum_rpm ?? null }
+      if (existing) {
+        await supabase.from('experiment_shelling').update({ ...shellPayload, updated_at: new Date().toISOString() }).eq('experiment_id', id)
+      } else {
+        await supabase.from('experiment_shelling').insert({ experiment_id: id, ...shellPayload, created_by: user.id })
+      }
+    }
+
+    const { data, error } = await supabase.from('experiments').select('*').eq('id', id).single()
+    if (error) throw error
+    return data
   },
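The new `updateExperiment` first routes each incoming field either to the core `experiments` row (whitelisted keys) or to a per-phase child table, skipping `undefined` values entirely. A minimal sketch of that routing step as a pure function (the `splitUpdates` helper name is hypothetical; the real code does this inline with `coreKeys`):

```typescript
// Core columns that live on the experiments table itself (from the diff).
const CORE_KEYS = [
  'experiment_number',
  'reps_required',
  'weight_per_repetition_lbs',
  'results_status',
  'completion_status',
  'phase_id',
] as const

// Split a partial update object into core-table updates and phase-config
// updates. `undefined` means "field not provided" and is skipped, so an
// absent field never overwrites an existing value.
function splitUpdates(updates: Record<string, unknown>): {
  core: Record<string, unknown>
  phaseConfig: Record<string, unknown>
} {
  const core: Record<string, unknown> = {}
  const phaseConfig: Record<string, unknown> = {}
  for (const [k, v] of Object.entries(updates)) {
    if (v === undefined) continue
    if ((CORE_KEYS as readonly string[]).includes(k)) core[k] = v
    else phaseConfig[k] = v
  }
  return { core, phaseConfig }
}
```

Note the asymmetry this preserves from the diff: `undefined` is "leave alone", while an explicit `null` (e.g. `ring_gap_inches: null`) is a real value that clears the column.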
@@ -793,13 +1001,16 @@ export const experimentManagement = {
-  // Check if experiment number is unique
-  async isExperimentNumberUnique(experimentNumber: number, excludeId?: string): Promise<boolean> {
+  // Check if experiment number is unique within the same phase (experiment_number + phase_id must be unique)
+  async isExperimentNumberUnique(experimentNumber: number, phaseId?: string, excludeId?: string): Promise<boolean> {
     let query = supabase
       .from('experiments')
       .select('id')
       .eq('experiment_number', experimentNumber)

+    if (phaseId) {
+      query = query.eq('phase_id', phaseId)
+    }
     if (excludeId) {
       query = query.neq('id', excludeId)
     }
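The filters the new query chains together are easy to check against an in-memory model. A hedged sketch of the same predicate over already-fetched rows (`isNumberUniqueInPhase` and `ExperimentRow` are illustrative names, not part of the diff):

```typescript
// Minimal row shape for the uniqueness check.
interface ExperimentRow {
  id: string
  experiment_number: number
  phase_id?: string | null
}

// Uniqueness is scoped to (experiment_number, phase_id); the row currently
// being edited can be excluded, matching the query's .neq('id', excludeId).
function isNumberUniqueInPhase(
  rows: ExperimentRow[],
  experimentNumber: number,
  phaseId?: string,
  excludeId?: string
): boolean {
  return !rows.some(r =>
    r.experiment_number === experimentNumber &&
    (phaseId === undefined || r.phase_id === phaseId) &&
    (excludeId === undefined || r.id !== excludeId)
  )
}
```

This is why the modal's error message changed to "already exists in this phase": experiment number 1 can now exist once per book rather than once globally.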
@@ -9,7 +9,7 @@
     "build:watch": "vite build --watch",
     "serve:dist": "serve -s dist -l 3003",
     "preview": "vite preview --port 3003",
-    "dev:watch": "npm run build && (npm run build:watch &) && sleep 1 && npx http-server dist -p 3003 --cors -c-1"
+    "dev:watch": "./wait-and-serve.sh"
   },
   "dependencies": {
     "@supabase/supabase-js": "^2.52.0",

(File diff suppressed because it is too large)
@@ -70,8 +70,6 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
|
||||
|
||||
// Track repetitions that have been dropped/moved and should show time points
|
||||
const [repetitionsWithTimes, setRepetitionsWithTimes] = useState<Set<string>>(new Set())
|
||||
// Track which repetitions are locked (prevent dragging)
|
||||
const [lockedSchedules, setLockedSchedules] = useState<Set<string>>(new Set())
|
||||
// Track which repetitions are currently being scheduled
|
||||
const [schedulingRepetitions, setSchedulingRepetitions] = useState<Set<string>>(new Set())
|
||||
// Track conductor assignments for each phase marker (markerId -> conductorIds[])
|
||||
@@ -253,44 +251,22 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
|
||||
}
|
||||
|
||||
const toggleRepetition = (repId: string) => {
|
||||
// Checking/unchecking should only control visibility on the timeline.
|
||||
// It must NOT clear scheduling info or conductor assignments.
|
||||
setSelectedRepetitionIds(prev => {
|
||||
const next = new Set(prev)
|
||||
if (next.has(repId)) {
|
||||
// Hide this repetition from the timeline
|
||||
next.delete(repId)
|
||||
// Remove from scheduled repetitions when unchecked
|
||||
setScheduledRepetitions(prevScheduled => {
|
||||
const newScheduled = { ...prevScheduled }
|
||||
delete newScheduled[repId]
|
||||
return newScheduled
|
||||
})
|
||||
// Clear all related state when unchecked
|
||||
setRepetitionsWithTimes(prev => {
|
||||
const next = new Set(prev)
|
||||
next.delete(repId)
|
||||
return next
|
||||
})
|
||||
setLockedSchedules(prev => {
|
||||
const next = new Set(prev)
|
||||
next.delete(repId)
|
||||
return next
|
||||
})
|
||||
setSchedulingRepetitions(prev => {
|
||||
const next = new Set(prev)
|
||||
next.delete(repId)
|
||||
return next
|
||||
})
|
||||
// Re-stagger remaining repetitions
|
||||
const remainingIds = Array.from(next).filter(id => id !== repId)
|
||||
if (remainingIds.length > 0) {
|
||||
reStaggerRepetitions(remainingIds)
|
||||
}
|
||||
// Keep scheduledRepetitions and repetitionsWithTimes intact so that
|
||||
// re-checking the box restores the repetition in the correct spot.
|
||||
} else {
|
||||
// Show this repetition on the timeline
|
||||
next.add(repId)
|
||||
// Auto-spawn when checked - pass the updated set to ensure correct stagger calculation
|
||||
// spawnSingleRepetition will position the new repetition relative to existing ones
|
||||
// without resetting existing positions
|
||||
spawnSingleRepetition(repId, next)
|
||||
// Re-stagger all existing repetitions to prevent overlap
|
||||
// Note: reStaggerRepetitions will automatically skip locked repetitions
|
||||
reStaggerRepetitions([...next, repId])
|
||||
}
|
||||
return next
|
||||
})
|
||||
@@ -305,20 +281,14 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
|
||||
const allSelected = allRepetitions.every(rep => prev.has(rep.id))
|
||||
|
||||
if (allSelected) {
|
||||
// Deselect all repetitions in this phase
|
||||
// Deselect all repetitions in this phase (hide from timeline only)
|
||||
const next = new Set(prev)
|
||||
allRepetitions.forEach(rep => {
|
||||
next.delete(rep.id)
|
||||
// Remove from scheduled repetitions
|
||||
setScheduledRepetitions(prevScheduled => {
|
||||
const newScheduled = { ...prevScheduled }
|
||||
delete newScheduled[rep.id]
|
||||
return newScheduled
|
||||
})
|
||||
})
|
||||
return next
|
||||
} else {
|
||||
// Select all repetitions in this phase
|
||||
// Select all repetitions in this phase (show on timeline)
|
||||
const next = new Set(prev)
|
||||
allRepetitions.forEach(rep => {
|
||||
next.add(rep.id)
|
||||
@@ -356,7 +326,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
|
||||
|
||||
// Re-stagger all repetitions to prevent overlap
|
||||
// IMPORTANT: Skip locked repetitions to prevent them from moving
|
||||
const reStaggerRepetitions = useCallback((repIds: string[]) => {
|
||||
const reStaggerRepetitions = useCallback((repIds: string[], onlyResetWithoutCustomTimes: boolean = false) => {
|
||||
const tomorrow = new Date()
|
||||
tomorrow.setDate(tomorrow.getDate() + 1)
|
||||
tomorrow.setHours(9, 0, 0, 0)
|
||||
@@ -364,8 +334,11 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
|
||||
setScheduledRepetitions(prev => {
|
||||
const newScheduled = { ...prev }
|
||||
|
||||
// Filter out locked repetitions - they should not be moved
|
||||
const unlockedRepIds = repIds.filter(repId => !lockedSchedules.has(repId))
|
||||
// If onlyResetWithoutCustomTimes is true, filter out repetitions that have custom times set
|
||||
let unlockedRepIds = repIds
|
||||
if (onlyResetWithoutCustomTimes) {
|
||||
unlockedRepIds = unlockedRepIds.filter(repId => !repetitionsWithTimes.has(repId))
|
||||
}
|
||||
|
||||
// Calculate stagger index only for unlocked repetitions
|
||||
let staggerIndex = 0
|
||||
@@ -407,7 +380,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
|
||||
|
||||
return newScheduled
|
||||
})
|
||||
}, [lockedSchedules, repetitionsByExperiment, experimentsByPhase, soakingByExperiment, airdryingByExperiment])
|
||||
}, [repetitionsByExperiment, experimentsByPhase, soakingByExperiment, airdryingByExperiment, repetitionsWithTimes])
|
||||
|
||||
// Spawn a single repetition in calendar
|
||||
const spawnSingleRepetition = (repId: string, updatedSelectedIds?: Set<string>) => {
|
||||
@@ -477,10 +450,11 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
       let newScheduled = { ...prev }

       const clampToReasonableHours = (d: Date) => {
+        // Allow full 24 hours (midnight to midnight)
         const min = new Date(d)
-        min.setHours(5, 0, 0, 0)
+        min.setHours(0, 0, 0, 0)
         const max = new Date(d)
-        max.setHours(23, 0, 0, 0)
+        max.setHours(23, 59, 59, 999)
         const t = d.getTime()
         return new Date(Math.min(Math.max(t, min.getTime()), max.getTime()))
       }
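The widened clamp keeps any dragged time inside its own calendar day; the same arithmetic in isolation (helper name is illustrative):

```typescript
// Clamp a Date to its enclosing day: [00:00:00.000, 23:59:59.999].
function clampToDay(d: Date): Date {
  const min = new Date(d)
  min.setHours(0, 0, 0, 0)
  const max = new Date(d)
  max.setHours(23, 59, 59, 999)
  return new Date(Math.min(Math.max(d.getTime(), min.getTime()), max.getTime()))
}

const threeAm = new Date(2024, 0, 15, 3, 0, 0)
console.log(clampToDay(threeAm).getTime() === threeAm.getTime()) // → true
```

Note that because the bounds are derived from `d` itself, the full-day version is effectively a pass-through for any in-day value; the old 5:00/23:00 bounds were what actually moved early-morning and late-night drops.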
@@ -536,13 +510,10 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
       const repetition = repetitionsByExperiment[scheduled.experimentId]?.find(r => r.id === scheduled.repetitionId)

       if (experiment && repetition && scheduled.soakingStart) {
-        const isLocked = lockedSchedules.has(scheduled.repetitionId)
-        const lockIcon = isLocked ? '🔒' : '🔓'

         // Soaking marker
         events.push({
           id: `${scheduled.repetitionId}-soaking`,
-          title: `${lockIcon} 💧 Soaking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
+          title: `💧 Soaking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
           start: scheduled.soakingStart,
           end: new Date(scheduled.soakingStart.getTime() + 15 * 60000), // 15 minute duration for better visibility
           resource: 'soaking'
@@ -552,7 +523,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
         if (scheduled.airdryingStart) {
           events.push({
             id: `${scheduled.repetitionId}-airdrying`,
-            title: `${lockIcon} 🌬️ Airdrying - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
+            title: `🌬️ Airdrying - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
             start: scheduled.airdryingStart,
             end: new Date(scheduled.airdryingStart.getTime() + 15 * 60000), // 15 minute duration for better visibility
             resource: 'airdrying'
@@ -563,7 +534,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
         if (scheduled.crackingStart) {
           events.push({
             id: `${scheduled.repetitionId}-cracking`,
-            title: `${lockIcon} ⚡ Cracking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
+            title: `⚡ Cracking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
             start: scheduled.crackingStart,
             end: new Date(scheduled.crackingStart.getTime() + 15 * 60000), // 15 minute duration for better visibility
             resource: 'cracking'
@@ -573,7 +544,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
     })

     return events
-  }, [scheduledRepetitions, experimentsByPhase, repetitionsByExperiment, lockedSchedules])
+  }, [scheduledRepetitions, experimentsByPhase, repetitionsByExperiment])

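Each phase marker is rendered as a fixed 15-minute calendar window purely for visibility; the end-time arithmetic in isolation (helper name is illustrative):

```typescript
// A marker's calendar window: the scheduled start plus a fixed 15-minute duration.
const MARKER_MINUTES = 15

function markerWindow(start: Date): { start: Date; end: Date } {
  return { start, end: new Date(start.getTime() + MARKER_MINUTES * 60000) }
}

const w = markerWindow(new Date(2024, 0, 15, 9, 0, 0))
console.log(w.end.getTime() - w.start.getTime()) // → 900000
```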
   // Memoize the calendar events
   const calendarEvents = useMemo(() => {
@@ -609,15 +580,16 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
     return moment(date).format('MMM D, h:mm A')
   }

-  const toggleScheduleLock = (repId: string) => {
-    setLockedSchedules(prev => {
-      const next = new Set(prev)
-      if (next.has(repId)) {
-        next.delete(repId)
-      } else {
-        next.add(repId)
-      }
-      return next
+  // Remove all conductor assignments from a repetition
+  const removeRepetitionAssignments = (repId: string) => {
+    const markerIdPrefix = repId
+    setConductorAssignments(prev => {
+      const newAssignments = { ...prev }
+      // Remove assignments for all three phases
+      delete newAssignments[`${markerIdPrefix}-soaking`]
+      delete newAssignments[`${markerIdPrefix}-airdrying`]
+      delete newAssignments[`${markerIdPrefix}-cracking`]
+      return newAssignments
     })
   }

@@ -625,24 +597,16 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
     // Only make repetition markers draggable, not availability events
     const resource = event.resource as string
     if (resource === 'soaking' || resource === 'airdrying' || resource === 'cracking') {
-      // Check if the repetition is locked
-      const eventId = event.id as string
-      const repId = eventId.split('-')[0]
-      const isLocked = lockedSchedules.has(repId)
-      return !isLocked
+      return true
     }
     return false
-  }, [lockedSchedules])
+  }, [])

   const eventPropGetter = useCallback((event: any) => {
     const resource = event.resource as string

     // Styling for repetition markers (foreground events)
     if (resource === 'soaking' || resource === 'airdrying' || resource === 'cracking') {
-      const eventId = event.id as string
-      const repId = eventId.split('-')[0]
-      const isLocked = lockedSchedules.has(repId)
-
       const colors = {
         soaking: '#3b82f6', // blue
         airdrying: '#10b981', // green
@@ -652,8 +616,8 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>

       return {
         style: {
-          backgroundColor: isLocked ? '#9ca3af' : color, // gray if locked
-          borderColor: isLocked ? color : color, // border takes original color when locked
+          backgroundColor: color,
+          borderColor: color,
           color: 'white',
           borderRadius: '8px',
           border: '2px solid',
@@ -674,17 +638,17 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
           overflow: 'hidden',
           textOverflow: 'ellipsis',
           whiteSpace: 'nowrap',
-          cursor: isLocked ? 'not-allowed' : 'grab',
-          boxShadow: isLocked ? '0 1px 2px rgba(0,0,0,0.1)' : '0 2px 4px rgba(0,0,0,0.2)',
+          cursor: 'grab',
+          boxShadow: '0 2px 4px rgba(0,0,0,0.2)',
           transition: 'all 0.2s ease',
-          opacity: isLocked ? 0.7 : 1
+          opacity: 1
         }
       }
     }

     // Default styling for other events
     return {}
-  }, [lockedSchedules])
+  }, [])

   const scheduleRepetition = async (repId: string, experimentId: string) => {
     setSchedulingRepetitions(prev => new Set(prev).add(repId))
@@ -756,6 +720,51 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
     }
   }

+  // Unschedule a repetition: clear its scheduling info and unassign all conductors.
+  const unscheduleRepetition = async (repId: string, experimentId: string) => {
+    setSchedulingRepetitions(prev => new Set(prev).add(repId))
+
+    try {
+      // Remove all conductor assignments for this repetition
+      removeRepetitionAssignments(repId)
+
+      // Clear scheduled_date on the repetition in local state
+      setRepetitionsByExperiment(prev => ({
+        ...prev,
+        [experimentId]: prev[experimentId]?.map(r =>
+          r.id === repId ? { ...r, scheduled_date: null } : r
+        ) || []
+      }))
+
+      // Clear scheduled times for this repetition so it disappears from the timeline
+      setScheduledRepetitions(prev => {
+        const next = { ...prev }
+        delete next[repId]
+        return next
+      })
+
+      // This repetition no longer has active times
+      setRepetitionsWithTimes(prev => {
+        const next = new Set(prev)
+        next.delete(repId)
+        return next
+      })
+
+      // Also clear scheduled_date in the database for this repetition
+      await repetitionManagement.updateRepetition(repId, {
+        scheduled_date: null
+      })
+    } catch (error: any) {
+      setError(error?.message || 'Failed to unschedule repetition')
+    } finally {
+      setSchedulingRepetitions(prev => {
+        const next = new Set(prev)
+        next.delete(repId)
+        return next
+      })
+    }
+  }
+
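Unscheduling clears three pieces of state with the same shape of update: copy, delete the key (or Set member), return the copy. A minimal sketch of that pattern outside React (helper names are illustrative):

```typescript
// Remove a key from a record and a member from a set without mutating the
// originals — the copy-then-delete pattern unscheduleRepetition applies to state.
function withoutKey<T>(rec: Record<string, T>, key: string): Record<string, T> {
  const next = { ...rec }
  delete next[key]
  return next
}

function withoutMember(set: Set<string>, member: string): Set<string> {
  const next = new Set(set)
  next.delete(member)
  return next
}

const scheduled = { 'rep-1': 1, 'rep-2': 2 }
console.log(Object.keys(withoutKey(scheduled, 'rep-1')).join(',')) // → rep-2
console.log(Object.keys(scheduled).length) // original untouched → 2
```

Returning fresh objects is what lets React detect the change; mutating `prev` in place would skip the re-render.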
   // Restore scroll position after scheduledRepetitions changes
   useEffect(() => {
     if (scrollPositionRef.current) {
@@ -806,11 +815,15 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
       phase: 'soaking' | 'airdrying' | 'cracking'
       startTime: Date
       assignedConductors: string[]
-      locked: boolean
     }> = []

     Object.values(scheduledRepetitions).forEach(scheduled => {
+      const repId = scheduled.repetitionId
+      // Only include markers for repetitions that are checked (selected)
+      if (!selectedRepetitionIds.has(repId)) {
+        return
+      }

       const markerIdPrefix = repId

       if (scheduled.soakingStart) {
@@ -820,8 +833,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
           experimentId: scheduled.experimentId,
           phase: 'soaking',
           startTime: scheduled.soakingStart,
-          assignedConductors: conductorAssignments[`${markerIdPrefix}-soaking`] || [],
-          locked: lockedSchedules.has(repId)
+          assignedConductors: conductorAssignments[`${markerIdPrefix}-soaking`] || []
         })
       }

@@ -832,8 +844,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
           experimentId: scheduled.experimentId,
           phase: 'airdrying',
           startTime: scheduled.airdryingStart,
-          assignedConductors: conductorAssignments[`${markerIdPrefix}-airdrying`] || [],
-          locked: lockedSchedules.has(repId)
+          assignedConductors: conductorAssignments[`${markerIdPrefix}-airdrying`] || []
         })
       }

@@ -844,8 +855,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
           experimentId: scheduled.experimentId,
           phase: 'cracking',
           startTime: scheduled.crackingStart,
-          assignedConductors: conductorAssignments[`${markerIdPrefix}-cracking`] || [],
-          locked: lockedSchedules.has(repId)
+          assignedConductors: conductorAssignments[`${markerIdPrefix}-cracking`] || []
         })
       }
     })
@@ -856,7 +866,66 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
       conductorAvailabilities,
       phaseMarkers
     }
-  }, [selectedConductorIds, conductors, conductorColorMap, colorPalette, availabilityEvents, scheduledRepetitions, conductorAssignments, lockedSchedules, calendarStartDate, calendarZoom])
+  }, [selectedConductorIds, conductors, conductorColorMap, colorPalette, availabilityEvents, scheduledRepetitions, conductorAssignments, calendarStartDate, calendarZoom, selectedRepetitionIds])

+  // Build repetition metadata mapping for timeline display
+  const repetitionMetadata = useMemo(() => {
+    const metadata: Record<string, { phaseName: string; experimentNumber: number; repetitionNumber: number; experimentId: string; isScheduledInDb: boolean }> = {}
+
+    Object.values(scheduledRepetitions).forEach(scheduled => {
+      const repId = scheduled.repetitionId
+      // Only include metadata for repetitions that are checked (selected)
+      if (!selectedRepetitionIds.has(repId)) {
+        return
+      }
+
+      const experiment = Object.values(experimentsByPhase).flat().find(e => e.id === scheduled.experimentId)
+      const repetition = Object.values(repetitionsByExperiment).flat().find(r => r.id === repId)
+      const phase = phases.find(p =>
+        Object.values(experimentsByPhase[p.id] || []).some(e => e.id === scheduled.experimentId)
+      )
+
+      if (experiment && repetition && phase) {
+        metadata[repId] = {
+          phaseName: phase.name,
+          experimentNumber: experiment.experiment_number,
+          repetitionNumber: repetition.repetition_number,
+          experimentId: scheduled.experimentId,
+          // Consider a repetition "scheduled" in DB if it has a non-null scheduled_date
+          isScheduledInDb: Boolean(repetition.scheduled_date)
+        }
+      }
+    })
+
+    return metadata
+  }, [scheduledRepetitions, experimentsByPhase, repetitionsByExperiment, phases, selectedRepetitionIds])
+
+  // Scroll to repetition in accordion
+  const handleScrollToRepetition = useCallback(async (repetitionId: string) => {
+    // First, expand the phase if it's collapsed
+    const repetition = Object.values(repetitionsByExperiment).flat().find(r => r.id === repetitionId)
+    if (repetition) {
+      const experiment = Object.values(experimentsByPhase).flat().find(e =>
+        (repetitionsByExperiment[e.id] || []).some(r => r.id === repetitionId)
+      )
+      if (experiment) {
+        const phase = phases.find(p =>
+          (experimentsByPhase[p.id] || []).some(e => e.id === experiment.id)
+        )
+        if (phase && !expandedPhaseIds.has(phase.id)) {
+          await togglePhaseExpand(phase.id)
+          // Wait a bit for the accordion to expand
+          await new Promise(resolve => setTimeout(resolve, 300))
+        }
+      }
+    }
+
+    // Then scroll to the element
+    const element = document.getElementById(`repetition-${repetitionId}`)
+    if (element) {
+      element.scrollIntoView({ behavior: 'smooth', block: 'center' })
+    }
+  }, [repetitionsByExperiment, experimentsByPhase, phases, expandedPhaseIds, togglePhaseExpand])

   // Handlers for horizontal calendar
   const handleHorizontalMarkerDrag = useCallback((markerId: string, newTime: Date) => {
@@ -878,21 +947,6 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
     }))
   }, [])

-  const handleHorizontalMarkerLockToggle = useCallback((markerId: string) => {
-    // Marker ID format: ${repId}-${phase} where repId is a UUID with hyphens
-    // Split by '-' and take all but the last segment as repId
-    const parts = markerId.split('-')
-    const repId = parts.slice(0, -1).join('-')
-    setLockedSchedules(prev => {
-      const next = new Set(prev)
-      if (next.has(repId)) {
-        next.delete(repId)
-      } else {
-        next.add(repId)
-      }
-      return next
-    })
-  }, [])
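The removed lock handler documents a parsing detail worth keeping: marker IDs have the form `${repId}-${phase}`, and repId is a UUID that itself contains hyphens, so only the last `-` segment can be split off. A standalone sketch (function name is illustrative):

```typescript
// Split a marker ID of the form `${repId}-${phase}` where repId is a UUID.
// Taking every segment except the last recovers the full UUID intact.
function parseMarkerId(markerId: string): { repId: string; phase: string } {
  const parts = markerId.split('-')
  return { repId: parts.slice(0, -1).join('-'), phase: parts[parts.length - 1] }
}

const parsed = parseMarkerId('123e4567-e89b-12d3-a456-426614174000-soaking')
console.log(parsed.repId) // → 123e4567-e89b-12d3-a456-426614174000
console.log(parsed.phase) // → soaking
```

By contrast, the `eventId.split('-')[0]` pattern seen in the calendar event handlers elsewhere in this file would return only the first UUID segment (`123e4567`), so the slice-and-rejoin form is the one that actually works for these IDs.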

   return (
@@ -1027,7 +1081,9 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
                 phaseMarkers={horizontalCalendarData.phaseMarkers}
                 onMarkerDrag={handleHorizontalMarkerDrag}
                 onMarkerAssignConductors={handleHorizontalMarkerAssignConductors}
-                onMarkerLockToggle={handleHorizontalMarkerLockToggle}
+                repetitionMetadata={repetitionMetadata}
+                onScrollToRepetition={handleScrollToRepetition}
+                onScheduleRepetition={scheduleRepetition}
                 timeStep={15}
                 minHour={6}
                 maxHour={22}
@@ -1196,11 +1252,21 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
                   const checked = selectedRepetitionIds.has(rep.id)
                   const hasTimes = repetitionsWithTimes.has(rep.id)
                   const scheduled = scheduledRepetitions[rep.id]
-                  const isLocked = lockedSchedules.has(rep.id)
                   const isScheduling = schedulingRepetitions.has(rep.id)

+                  // Check if there are any conductor assignments
+                  const markerIdPrefix = rep.id
+                  const soakingConductors = conductorAssignments[`${markerIdPrefix}-soaking`] || []
+                  const airdryingConductors = conductorAssignments[`${markerIdPrefix}-airdrying`] || []
+                  const crackingConductors = conductorAssignments[`${markerIdPrefix}-cracking`] || []
+                  const hasAssignments = soakingConductors.length > 0 || airdryingConductors.length > 0 || crackingConductors.length > 0
+
                   return (
-                    <div key={rep.id} className="bg-white dark:bg-gray-900 border border-gray-200 dark:border-gray-700 rounded p-3">
+                    <div
+                      key={rep.id}
+                      id={`repetition-${rep.id}`}
+                      className="bg-white dark:bg-gray-900 border border-gray-200 dark:border-gray-700 rounded p-3"
+                    >
                       {/* Checkbox row */}
                       <label className="flex items-center gap-2">
                         <input
@@ -1212,44 +1278,89 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
                         <span className="text-sm font-medium text-gray-700 dark:text-gray-300">Rep {rep.repetition_number}</span>
                       </label>

-                      {/* Time points (shown only if has been dropped/moved) */}
-                      {hasTimes && scheduled && (
+                      {/* Time points (shown whenever the repetition has scheduled times) */}
+                      {scheduled && (
                         <div className="mt-2 ml-6 text-xs space-y-1">
-                          <div className="flex items-center gap-2">
-                            <span>💧</span>
-                            <span>Soaking: {formatTime(scheduled.soakingStart)}</span>
-                          </div>
-                          <div className="flex items-center gap-2">
-                            <span>🌬️</span>
-                            <span>Airdrying: {formatTime(scheduled.airdryingStart)}</span>
-                          </div>
-                          <div className="flex items-center gap-2">
-                            <span>⚡</span>
-                            <span>Cracking: {formatTime(scheduled.crackingStart)}</span>
-                          </div>
+                          {(() => {
+                            const repId = rep.id
+                            const markerIdPrefix = repId
+
+                            // Get assigned conductors for each phase
+                            const soakingConductors = conductorAssignments[`${markerIdPrefix}-soaking`] || []
+                            const airdryingConductors = conductorAssignments[`${markerIdPrefix}-airdrying`] || []
+                            const crackingConductors = conductorAssignments[`${markerIdPrefix}-cracking`] || []
+
+                            // Helper to get conductor names
+                            const getConductorNames = (conductorIds: string[]) => {
+                              return conductorIds.map(id => {
+                                const conductor = conductors.find(c => c.id === id)
+                                if (!conductor) return null
+                                return [conductor.first_name, conductor.last_name].filter(Boolean).join(' ') || conductor.email
+                              }).filter(Boolean).join(', ')
+                            }
+
+                            return (
+                              <>
+                                <div className="flex items-center gap-2 flex-wrap">
+                                  <span>💧</span>
+                                  <span>Soaking: {formatTime(scheduled.soakingStart)}</span>
+                                  {soakingConductors.length > 0 && (
+                                    <span className="text-blue-600 dark:text-blue-400">
+                                      ({getConductorNames(soakingConductors)})
+                                    </span>
+                                  )}
+                                </div>
+                                <div className="flex items-center gap-2 flex-wrap">
+                                  <span>🌬️</span>
+                                  <span>Airdrying: {formatTime(scheduled.airdryingStart)}</span>
+                                  {airdryingConductors.length > 0 && (
+                                    <span className="text-green-600 dark:text-green-400">
+                                      ({getConductorNames(airdryingConductors)})
+                                    </span>
+                                  )}
+                                </div>
+                                <div className="flex items-center gap-2 flex-wrap">
+                                  <span>⚡</span>
+                                  <span>Cracking: {formatTime(scheduled.crackingStart)}</span>
+                                  {crackingConductors.length > 0 && (
+                                    <span className="text-orange-600 dark:text-orange-400">
+                                      ({getConductorNames(crackingConductors)})
+                                    </span>
+                                  )}
+                                </div>
+                              </>
+                            )
+                          })()}

-                          {/* Lock checkbox and Schedule button */}
+                          {/* Remove Assignments button and Schedule/Unschedule button */}
                           <div className="flex items-center gap-3 mt-3 pt-2 border-t border-gray-200 dark:border-gray-600">
-                            <label className="flex items-center gap-2">
-                              <input
-                                type="checkbox"
-                                className="h-3 w-3 text-blue-600 border-gray-300 rounded"
-                                checked={isLocked}
-                                onChange={() => {
-                                  toggleScheduleLock(rep.id)
-                                }}
-                              />
-                              <span className="text-xs text-gray-600 dark:text-gray-400">
-                                {isLocked ? '🔒 Locked' : '🔓 Unlocked'}
-                              </span>
-                            </label>
-                            <button
-                              onClick={() => scheduleRepetition(rep.id, exp.id)}
-                              disabled={isScheduling || !isLocked}
-                              className="px-3 py-1 bg-green-600 hover:bg-green-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
-                            >
-                              {isScheduling ? 'Scheduling...' : 'Schedule'}
-                            </button>
+                            {hasAssignments && (
+                              <button
+                                onClick={() => removeRepetitionAssignments(rep.id)}
+                                disabled={Boolean(rep.scheduled_date)}
+                                className="px-3 py-1 bg-red-600 hover:bg-red-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
+                                title={rep.scheduled_date ? "Unschedule the repetition first before removing assignments" : "Remove all conductor assignments from this repetition"}
+                              >
+                                Remove Assignments
+                              </button>
+                            )}
+                            {rep.scheduled_date ? (
+                              <button
+                                onClick={() => unscheduleRepetition(rep.id, exp.id)}
+                                disabled={isScheduling}
+                                className="px-3 py-1 bg-yellow-600 hover:bg-yellow-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
+                              >
+                                {isScheduling ? 'Unscheduling...' : 'Unschedule'}
+                              </button>
+                            ) : (
+                              <button
+                                onClick={() => scheduleRepetition(rep.id, exp.id)}
+                                disabled={isScheduling}
+                                className="px-3 py-1 bg-green-600 hover:bg-green-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
+                              >
+                                {isScheduling ? 'Scheduling...' : 'Schedule'}
+                              </button>
+                            )}
                           </div>
                         </div>
                       )}

scheduling-remote/wait-and-serve.sh (new file, 57 lines)
@@ -0,0 +1,57 @@
+#!/bin/sh
+
+# Build the project first
+echo "Building scheduling-remote..."
+npm run build
+
+# Verify the initial build created remoteEntry.js
+REMOTE_ENTRY_PATH="dist/assets/remoteEntry.js"
+if [ ! -f "$REMOTE_ENTRY_PATH" ]; then
+  echo "ERROR: Initial build did not create remoteEntry.js!"
+  exit 1
+fi
+
+echo "Initial build complete. remoteEntry.js exists."
+
+# Start build:watch in the background
+echo "Starting build:watch in background..."
+npm run build:watch &
+BUILD_WATCH_PID=$!
+
+# Wait a moment for build:watch to start and potentially rebuild
+echo "Waiting for build:watch to stabilize..."
+sleep 3
+
+# Verify remoteEntry.js still exists (build:watch might have rebuilt it)
+MAX_WAIT=30
+WAIT_COUNT=0
+while [ ! -f "$REMOTE_ENTRY_PATH" ] && [ $WAIT_COUNT -lt $MAX_WAIT ]; do
+  sleep 1
+  WAIT_COUNT=$((WAIT_COUNT + 1))
+  if [ $((WAIT_COUNT % 5)) -eq 0 ]; then
+    echo "Waiting for remoteEntry.js after build:watch... (${WAIT_COUNT}s)"
+  fi
+done
+
+if [ ! -f "$REMOTE_ENTRY_PATH" ]; then
+  echo "ERROR: remoteEntry.js was not available after ${MAX_WAIT} seconds!"
+  kill $BUILD_WATCH_PID 2>/dev/null || true
+  exit 1
+fi
+
+# Wait a bit more to ensure build:watch has finished any initial rebuild
+echo "Ensuring build:watch has completed initial build..."
+sleep 2
+
+# Check file size to ensure it's not empty or being written
+FILE_SIZE=$(stat -f%z "$REMOTE_ENTRY_PATH" 2>/dev/null || stat -c%s "$REMOTE_ENTRY_PATH" 2>/dev/null || echo "0")
+if [ "$FILE_SIZE" -lt 100 ]; then
+  echo "WARNING: remoteEntry.js seems too small (${FILE_SIZE} bytes), waiting a bit more..."
+  sleep 2
+fi
+
+echo "remoteEntry.js is ready (${FILE_SIZE} bytes). Starting http-server..."
+
+# Start http-server and give it time to fully initialize
+# Use a simple approach: start server and wait a moment for it to be ready
+exec npx http-server dist -p 3003 --cors -c-1
supabase/.gitignore (new file, vendored, 8 lines)
@@ -0,0 +1,8 @@
+# Supabase
+.branches
+.temp
+
+# dotenvx
+.env.keys
+.env.local
+.env.*.local
@@ -64,9 +64,9 @@ supabase gen types typescript --local > management-dashboard-web-app/src/types/s

## Seed Data

-Seed files are run automatically after migrations when using docker-compose. They populate the database with initial data:
-- `seed_01_users.sql`: Creates admin user and initial user profiles
-- `seed_02_phase2_experiments.sql`: Creates initial experiment data
+Seed files are run automatically after migrations when using `supabase db reset` (see `config.toml` → `[db.seed]` → `sql_paths`). Currently only the user seed is enabled:
+- `seed_01_users.sql`: Creates the admin user and initial user profiles (enabled)
+- `seed_02_phase2_experiments.sql`: Initial experiment data (temporarily disabled; add it back to `sql_paths` in `config.toml` to re-enable)

## Configuration

supabase/archive/test_migration.sql (new file, 14 lines)
@@ -0,0 +1,14 @@
+-- Test migration to create experiment_phases table
+CREATE TABLE IF NOT EXISTS public.experiment_phases (
+  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
+  name TEXT NOT NULL UNIQUE,
+  description TEXT,
+  created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
+  updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
+  created_by UUID NOT NULL
+);
+
+-- Insert test data
+INSERT INTO public.experiment_phases (name, description, created_by)
+VALUES ('Phase 2 of JC Experiments', 'Second phase of JC Cracker experiments', '00000000-0000-0000-0000-000000000000')
+ON CONFLICT (name) DO NOTHING;
@@ -57,7 +57,9 @@ schema_paths = []
 enabled = true
 # Specifies an ordered list of seed files to load during db reset.
 # Supports glob patterns relative to supabase directory: "./seeds/*.sql"
-sql_paths = ["./seed_01_users.sql", "./seed_02_phase2_experiments.sql"]
+# Temporarily only the user seed; other seeds suppressed.
+sql_paths = ["./seed_01_users.sql"]
+# sql_paths = ["./seed_01_users.sql", "./seed_02_phase2_experiments.sql"]
+# , "./seed_04_phase2_jc_experiments.sql", "./seed_05_meyer_experiments.sql"]

 [db.network_restrictions]

@@ -70,6 +70,10 @@ CREATE TABLE IF NOT EXISTS public.shelling (
   scheduled_start_time TIMESTAMP WITH TIME ZONE NOT NULL,
   actual_start_time TIMESTAMP WITH TIME ZONE,
   actual_end_time TIMESTAMP WITH TIME ZONE,
+  -- The space (in inches) between the sheller's rings
+  ring_gap_inches NUMERIC(6,2) CHECK (ring_gap_inches > 0),
+  -- The revolutions per minute for the sheller drum
+  drum_rpm INTEGER CHECK (drum_rpm > 0),
   created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
   updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
   created_by UUID NOT NULL REFERENCES public.user_profiles(id),

@@ -56,6 +56,80 @@ CREATE INDEX IF NOT EXISTS idx_phase_executions_machine_type_id
 CREATE INDEX IF NOT EXISTS idx_phase_executions_created_by
   ON public.experiment_phase_executions(created_by);

+-- =============================================
+-- 2.5. CREATE CONDUCTOR ASSIGNMENTS TABLE
+-- =============================================
+
+-- Table to store conductor assignments to phase executions
+-- This allows multiple conductors to be assigned to each phase execution
+CREATE TABLE IF NOT EXISTS public.experiment_phase_assignments (
+  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
+  phase_execution_id UUID NOT NULL REFERENCES public.experiment_phase_executions(id) ON DELETE CASCADE,
+  conductor_id UUID NOT NULL REFERENCES public.user_profiles(id) ON DELETE CASCADE,
+
+  -- Scheduled times for this assignment (should match phase_execution times, but stored for clarity)
+  scheduled_start_time TIMESTAMP WITH TIME ZONE NOT NULL,
+  scheduled_end_time TIMESTAMP WITH TIME ZONE,
+
+  -- Status tracking
+  status TEXT NOT NULL DEFAULT 'scheduled'
+    CHECK (status IN ('scheduled', 'in_progress', 'completed', 'cancelled')),
+
+  -- Optional notes about the assignment
+  notes TEXT,
+
+  created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
+  updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
+  created_by UUID NOT NULL REFERENCES public.user_profiles(id),
+
+  -- Ensure scheduled_end_time is after scheduled_start_time
+  CONSTRAINT valid_scheduled_time_range CHECK (scheduled_end_time IS NULL OR scheduled_end_time > scheduled_start_time),
+
+  -- Ensure unique assignment per conductor per phase execution
+  CONSTRAINT unique_conductor_phase_execution UNIQUE (phase_execution_id, conductor_id)
+);
+
+-- Indexes for conductor assignments
+CREATE INDEX IF NOT EXISTS idx_phase_assignments_phase_execution_id
+  ON public.experiment_phase_assignments(phase_execution_id);
+CREATE INDEX IF NOT EXISTS idx_phase_assignments_conductor_id
+  ON public.experiment_phase_assignments(conductor_id);
+CREATE INDEX IF NOT EXISTS idx_phase_assignments_status
+  ON public.experiment_phase_assignments(status);
+CREATE INDEX IF NOT EXISTS idx_phase_assignments_scheduled_start_time
+  ON public.experiment_phase_assignments(scheduled_start_time);
+CREATE INDEX IF NOT EXISTS idx_phase_assignments_created_by
+  ON public.experiment_phase_assignments(created_by);
+
+-- Trigger for updated_at on conductor assignments
+CREATE TRIGGER set_updated_at_phase_assignments
+  BEFORE UPDATE ON public.experiment_phase_assignments
+  FOR EACH ROW
+  EXECUTE FUNCTION public.handle_updated_at();
+
+-- Grant permissions
+GRANT ALL ON public.experiment_phase_assignments TO authenticated;
+
+-- Enable Row Level Security
+ALTER TABLE public.experiment_phase_assignments ENABLE ROW LEVEL SECURITY;
+
+-- RLS Policies for conductor assignments
+CREATE POLICY "Phase assignments are viewable by authenticated users"
+  ON public.experiment_phase_assignments
+  FOR SELECT USING (auth.role() = 'authenticated');
+
+CREATE POLICY "Phase assignments are insertable by authenticated users"
+  ON public.experiment_phase_assignments
+  FOR INSERT WITH CHECK (auth.role() = 'authenticated');
+
+CREATE POLICY "Phase assignments are updatable by authenticated users"
+  ON public.experiment_phase_assignments
+  FOR UPDATE USING (auth.role() = 'authenticated');
+
+CREATE POLICY "Phase assignments are deletable by authenticated users"
+  ON public.experiment_phase_assignments
+  FOR DELETE USING (auth.role() = 'authenticated');
+
 -- =============================================
 -- 3. FUNCTION: Calculate Sequential Phase Start Times
 -- =============================================

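Client code can mirror the table's `valid_scheduled_time_range` constraint before inserting, failing fast instead of round-tripping to hit the CHECK; a minimal sketch (function name is illustrative):

```typescript
// Mirror of the valid_scheduled_time_range CHECK:
// scheduled_end_time may be NULL, otherwise it must be strictly after the start.
function isValidTimeRange(start: Date, end: Date | null): boolean {
  return end === null || end.getTime() > start.getTime()
}

const start = new Date('2024-01-15T09:00:00Z')
console.log(isValidTimeRange(start, null)) // → true
console.log(isValidTimeRange(start, new Date('2024-01-15T08:00:00Z'))) // → false
```

The strict `>` matters: an assignment whose end equals its start is rejected by the constraint, so the client check should reject it too.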
supabase/migrations/00015_experiment_shelling_params.sql (new file, 9 lines)
@@ -0,0 +1,9 @@
+-- Add experiment-level shelling parameters (defaults for repetitions)
+-- These match the shelling table attributes: ring_gap_inches, drum_rpm
+
+ALTER TABLE public.experiments
+  ADD COLUMN IF NOT EXISTS ring_gap_inches NUMERIC(6,2) CHECK (ring_gap_inches IS NULL OR ring_gap_inches > 0),
+  ADD COLUMN IF NOT EXISTS drum_rpm INTEGER CHECK (drum_rpm IS NULL OR drum_rpm > 0);
+
+COMMENT ON COLUMN public.experiments.ring_gap_inches IS 'Default space (inches) between sheller rings for this experiment';
+COMMENT ON COLUMN public.experiments.drum_rpm IS 'Default sheller drum revolutions per minute for this experiment';
@@ -0,0 +1,399 @@
-- Rename table experiment_phases to experiment_books
-- This migration renames the table and updates all dependent objects (views, functions, triggers, indexes, RLS).

-- =============================================
-- 1. RENAME TABLE
-- =============================================

ALTER TABLE public.experiment_phases RENAME TO experiment_books;

-- =============================================
-- 2. RENAME TRIGGER
-- =============================================

DROP TRIGGER IF EXISTS set_updated_at_experiment_phases ON public.experiment_books;
CREATE TRIGGER set_updated_at_experiment_books
    BEFORE UPDATE ON public.experiment_books
    FOR EACH ROW
    EXECUTE FUNCTION public.handle_updated_at();

-- =============================================
-- 3. RENAME CONSTRAINT
-- =============================================

ALTER TABLE public.experiment_books
    RENAME CONSTRAINT ck_experiment_phases_machine_required_when_cracking
    TO ck_experiment_books_machine_required_when_cracking;

-- =============================================
-- 4. RENAME INDEXES
-- =============================================

ALTER INDEX IF EXISTS public.idx_experiment_phases_name RENAME TO idx_experiment_books_name;
ALTER INDEX IF EXISTS public.idx_experiment_phases_cracking_machine_type_id RENAME TO idx_experiment_books_cracking_machine_type_id;

-- =============================================
-- 5. RLS POLICIES (drop old, create new with updated names)
-- =============================================

DROP POLICY IF EXISTS "Experiment phases are viewable by authenticated users" ON public.experiment_books;
DROP POLICY IF EXISTS "Experiment phases are insertable by authenticated users" ON public.experiment_books;
DROP POLICY IF EXISTS "Experiment phases are updatable by authenticated users" ON public.experiment_books;
DROP POLICY IF EXISTS "Experiment phases are deletable by authenticated users" ON public.experiment_books;

CREATE POLICY "Experiment books are viewable by authenticated users" ON public.experiment_books
    FOR SELECT USING (auth.role() = 'authenticated');

CREATE POLICY "Experiment books are insertable by authenticated users" ON public.experiment_books
    FOR INSERT WITH CHECK (auth.role() = 'authenticated');

CREATE POLICY "Experiment books are updatable by authenticated users" ON public.experiment_books
    FOR UPDATE USING (auth.role() = 'authenticated');

CREATE POLICY "Experiment books are deletable by authenticated users" ON public.experiment_books
    FOR DELETE USING (auth.role() = 'authenticated');

-- =============================================
-- 6. UPDATE FUNCTION: create_phase_executions_for_repetition (now reads experiment_books instead of experiment_phases)
-- =============================================

CREATE OR REPLACE FUNCTION create_phase_executions_for_repetition()
RETURNS TRIGGER AS $$
DECLARE
    exp_phase_config RECORD;
    phase_type_list TEXT[] := ARRAY[]::TEXT[];
    phase_name TEXT;
BEGIN
    SELECT
        ep.has_soaking,
        ep.has_airdrying,
        ep.has_cracking,
        ep.has_shelling,
        ep.cracking_machine_type_id
    INTO exp_phase_config
    FROM public.experiments e
    JOIN public.experiment_books ep ON e.phase_id = ep.id
    WHERE e.id = NEW.experiment_id;

    IF exp_phase_config.has_soaking THEN
        phase_type_list := array_append(phase_type_list, 'soaking');
    END IF;
    IF exp_phase_config.has_airdrying THEN
        phase_type_list := array_append(phase_type_list, 'airdrying');
    END IF;
    IF exp_phase_config.has_cracking THEN
        phase_type_list := array_append(phase_type_list, 'cracking');
    END IF;
    IF exp_phase_config.has_shelling THEN
        phase_type_list := array_append(phase_type_list, 'shelling');
    END IF;

    FOREACH phase_name IN ARRAY phase_type_list
    LOOP
        INSERT INTO public.experiment_phase_executions (
            repetition_id,
            phase_type,
            scheduled_start_time,
            status,
            created_by,
            soaking_duration_minutes,
            duration_minutes,
            machine_type_id
        )
        VALUES (
            NEW.id,
            phase_name,
            NOW(),
            'pending',
            NEW.created_by,
            NULL,
            NULL,
            CASE WHEN phase_name = 'cracking'
                 THEN exp_phase_config.cracking_machine_type_id
                 ELSE NULL END
        );
    END LOOP;

    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

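Assuming the trigger from the earlier migration still fires on INSERT into `experiment_repetitions`, adding a repetition seeds one `'pending'` execution per enabled phase. A sketch with placeholder UUIDs (not part of the migration):

```sql
-- Placeholder experiment id; illustrates the trigger's effect only.
INSERT INTO public.experiment_repetitions (experiment_id, repetition_number, created_by)
VALUES ('00000000-0000-0000-0000-000000000001', 1,
        (SELECT id FROM public.user_profiles LIMIT 1));

-- Up to four rows appear (soaking/airdrying/cracking/shelling), all 'pending';
-- machine_type_id is populated only for the cracking phase.
SELECT phase_type, status, machine_type_id
FROM public.experiment_phase_executions
WHERE repetition_id = (SELECT id FROM public.experiment_repetitions
                       WHERE experiment_id = '00000000-0000-0000-0000-000000000001'
                         AND repetition_number = 1);
```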
-- =============================================
-- 7. UPDATE FUNCTION: create_sample_experiment_phases (INSERT into experiment_books)
-- =============================================

CREATE OR REPLACE FUNCTION public.create_sample_experiment_phases()
RETURNS VOID AS $$
DECLARE
    jc_cracker_id UUID;
    meyer_cracker_id UUID;
BEGIN
    SELECT id INTO jc_cracker_id FROM public.machine_types WHERE name = 'JC Cracker';
    SELECT id INTO meyer_cracker_id FROM public.machine_types WHERE name = 'Meyer Cracker';

    INSERT INTO public.experiment_books (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by) VALUES
    ('Full Process - JC Cracker', 'Complete pecan processing with JC Cracker', true, true, true, true, jc_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1)),
    ('Full Process - Meyer Cracker', 'Complete pecan processing with Meyer Cracker', true, true, true, true, meyer_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1)),
    ('Cracking Only - JC Cracker', 'JC Cracker cracking process only', false, false, true, false, jc_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1)),
    ('Cracking Only - Meyer Cracker', 'Meyer Cracker cracking process only', false, false, true, false, meyer_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1))
    ON CONFLICT (name) DO NOTHING;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
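
Note that the seeder keeps its pre-rename function name (`create_sample_experiment_phases`) while inserting into `experiment_books`. It can be invoked directly:

```sql
-- Idempotent: re-running is a no-op thanks to ON CONFLICT (name) DO NOTHING.
SELECT public.create_sample_experiment_phases();
```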

-- =============================================
-- 8. UPDATE VIEWS (from 00014 - experiments_with_phases, repetitions_with_phases, experiments_with_all_reps_and_phases, get_experiment_with_reps_and_phases)
-- =============================================

CREATE OR REPLACE VIEW public.experiments_with_phases AS
SELECT
    e.id,
    e.experiment_number,
    e.reps_required,
    e.weight_per_repetition_lbs,
    e.results_status,
    e.completion_status,
    e.phase_id,
    e.created_at,
    e.updated_at,
    e.created_by,
    ep.name as phase_name,
    ep.description as phase_description,
    ep.has_soaking,
    ep.has_airdrying,
    ep.has_cracking,
    ep.has_shelling,
    er.id as first_repetition_id,
    er.repetition_number as first_repetition_number,
    soaking_e.id as soaking_id,
    soaking_e.scheduled_start_time as soaking_scheduled_start,
    soaking_e.actual_start_time as soaking_actual_start,
    soaking_e.soaking_duration_minutes,
    soaking_e.scheduled_end_time as soaking_scheduled_end,
    soaking_e.actual_end_time as soaking_actual_end,
    airdrying_e.id as airdrying_id,
    airdrying_e.scheduled_start_time as airdrying_scheduled_start,
    airdrying_e.actual_start_time as airdrying_actual_start,
    airdrying_e.duration_minutes as airdrying_duration,
    airdrying_e.scheduled_end_time as airdrying_scheduled_end,
    airdrying_e.actual_end_time as airdrying_actual_end,
    cracking_e.id as cracking_id,
    cracking_e.scheduled_start_time as cracking_scheduled_start,
    cracking_e.actual_start_time as cracking_actual_start,
    cracking_e.actual_end_time as cracking_actual_end,
    mt.name as machine_type_name,
    shelling_e.id as shelling_id,
    shelling_e.scheduled_start_time as shelling_scheduled_start,
    shelling_e.actual_start_time as shelling_actual_start,
    shelling_e.actual_end_time as shelling_actual_end
FROM public.experiments e
LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
LEFT JOIN LATERAL (
    SELECT id, repetition_number
    FROM public.experiment_repetitions
    WHERE experiment_id = e.id
    ORDER BY repetition_number
    LIMIT 1
) er ON true
LEFT JOIN public.experiment_phase_executions soaking_e
    ON soaking_e.repetition_id = er.id AND soaking_e.phase_type = 'soaking'
LEFT JOIN public.experiment_phase_executions airdrying_e
    ON airdrying_e.repetition_id = er.id AND airdrying_e.phase_type = 'airdrying'
LEFT JOIN public.experiment_phase_executions cracking_e
    ON cracking_e.repetition_id = er.id AND cracking_e.phase_type = 'cracking'
LEFT JOIN public.experiment_phase_executions shelling_e
    ON shelling_e.repetition_id = er.id AND shelling_e.phase_type = 'shelling'
LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id;

CREATE OR REPLACE VIEW public.repetitions_with_phases AS
SELECT
    er.id,
    er.experiment_id,
    er.repetition_number,
    er.status,
    er.created_at,
    er.updated_at,
    er.created_by,
    e.experiment_number,
    e.phase_id,
    e.weight_per_repetition_lbs,
    ep.name as phase_name,
    ep.has_soaking,
    ep.has_airdrying,
    ep.has_cracking,
    ep.has_shelling,
    soaking_e.scheduled_start_time as soaking_scheduled_start,
    soaking_e.actual_start_time as soaking_actual_start,
    soaking_e.soaking_duration_minutes,
    soaking_e.scheduled_end_time as soaking_scheduled_end,
    soaking_e.actual_end_time as soaking_actual_end,
    airdrying_e.scheduled_start_time as airdrying_scheduled_start,
    airdrying_e.actual_start_time as airdrying_actual_start,
    airdrying_e.duration_minutes as airdrying_duration,
    airdrying_e.scheduled_end_time as airdrying_scheduled_end,
    airdrying_e.actual_end_time as airdrying_actual_end,
    cracking_e.scheduled_start_time as cracking_scheduled_start,
    cracking_e.actual_start_time as cracking_actual_start,
    cracking_e.actual_end_time as cracking_actual_end,
    mt.name as machine_type_name,
    shelling_e.scheduled_start_time as shelling_scheduled_start,
    shelling_e.actual_start_time as shelling_actual_start,
    shelling_e.actual_end_time as shelling_actual_end
FROM public.experiment_repetitions er
JOIN public.experiments e ON er.experiment_id = e.id
LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
LEFT JOIN public.experiment_phase_executions soaking_e
    ON er.id = soaking_e.repetition_id AND soaking_e.phase_type = 'soaking'
LEFT JOIN public.experiment_phase_executions airdrying_e
    ON er.id = airdrying_e.repetition_id AND airdrying_e.phase_type = 'airdrying'
LEFT JOIN public.experiment_phase_executions cracking_e
    ON er.id = cracking_e.repetition_id AND cracking_e.phase_type = 'cracking'
LEFT JOIN public.experiment_phase_executions shelling_e
    ON er.id = shelling_e.repetition_id AND shelling_e.phase_type = 'shelling'
LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id;

-- experiments_with_all_reps_and_phases
CREATE OR REPLACE VIEW public.experiments_with_all_reps_and_phases AS
SELECT
    e.id as experiment_id,
    e.experiment_number,
    e.reps_required,
    e.weight_per_repetition_lbs,
    e.results_status,
    e.completion_status,
    e.phase_id,
    e.created_at as experiment_created_at,
    e.updated_at as experiment_updated_at,
    e.created_by as experiment_created_by,
    ep.name as phase_name,
    ep.description as phase_description,
    ep.has_soaking,
    ep.has_airdrying,
    ep.has_cracking,
    ep.has_shelling,
    ep.cracking_machine_type_id as phase_cracking_machine_type_id,
    er.id as repetition_id,
    er.repetition_number,
    er.status as repetition_status,
    er.scheduled_date,
    er.created_at as repetition_created_at,
    er.updated_at as repetition_updated_at,
    er.created_by as repetition_created_by,
    soaking_e.id as soaking_execution_id,
    soaking_e.scheduled_start_time as soaking_scheduled_start,
    soaking_e.actual_start_time as soaking_actual_start,
    soaking_e.soaking_duration_minutes,
    soaking_e.scheduled_end_time as soaking_scheduled_end,
    soaking_e.actual_end_time as soaking_actual_end,
    soaking_e.status as soaking_status,
    airdrying_e.id as airdrying_execution_id,
    airdrying_e.scheduled_start_time as airdrying_scheduled_start,
    airdrying_e.actual_start_time as airdrying_actual_start,
    airdrying_e.duration_minutes as airdrying_duration_minutes,
    airdrying_e.scheduled_end_time as airdrying_scheduled_end,
    airdrying_e.actual_end_time as airdrying_actual_end,
    airdrying_e.status as airdrying_status,
    cracking_e.id as cracking_execution_id,
    cracking_e.scheduled_start_time as cracking_scheduled_start,
    cracking_e.actual_start_time as cracking_actual_start,
    cracking_e.scheduled_end_time as cracking_scheduled_end,
    cracking_e.actual_end_time as cracking_actual_end,
    cracking_e.machine_type_id as cracking_machine_type_id,
    cracking_e.status as cracking_status,
    mt.name as machine_type_name,
    shelling_e.id as shelling_execution_id,
    shelling_e.scheduled_start_time as shelling_scheduled_start,
    shelling_e.actual_start_time as shelling_actual_start,
    shelling_e.scheduled_end_time as shelling_scheduled_end,
    shelling_e.actual_end_time as shelling_actual_end,
    shelling_e.status as shelling_status
FROM public.experiments e
LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
LEFT JOIN public.experiment_repetitions er ON er.experiment_id = e.id
LEFT JOIN public.experiment_phase_executions soaking_e
    ON soaking_e.repetition_id = er.id AND soaking_e.phase_type = 'soaking'
LEFT JOIN public.experiment_phase_executions airdrying_e
    ON airdrying_e.repetition_id = er.id AND airdrying_e.phase_type = 'airdrying'
LEFT JOIN public.experiment_phase_executions cracking_e
    ON cracking_e.repetition_id = er.id AND cracking_e.phase_type = 'cracking'
LEFT JOIN public.experiment_phase_executions shelling_e
    ON shelling_e.repetition_id = er.id AND shelling_e.phase_type = 'shelling'
LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id
ORDER BY e.experiment_number, er.repetition_number;

-- get_experiment_with_reps_and_phases function
CREATE OR REPLACE FUNCTION public.get_experiment_with_reps_and_phases(p_experiment_id UUID)
RETURNS TABLE (
    experiment_id UUID,
    experiment_number INTEGER,
    phase_name TEXT,
    repetitions JSONB
) AS $$
BEGIN
    RETURN QUERY
    SELECT
        e.id,
        e.experiment_number,
        ep.name,
        COALESCE(
            jsonb_agg(
                jsonb_build_object(
                    'repetition_id', er.id,
                    'repetition_number', er.repetition_number,
                    'status', er.status,
                    'scheduled_date', er.scheduled_date,
                    'soaking', jsonb_build_object(
                        'scheduled_start', soaking_e.scheduled_start_time,
                        'actual_start', soaking_e.actual_start_time,
                        'duration_minutes', soaking_e.soaking_duration_minutes,
                        'scheduled_end', soaking_e.scheduled_end_time,
                        'actual_end', soaking_e.actual_end_time,
                        'status', soaking_e.status
                    ),
                    'airdrying', jsonb_build_object(
                        'scheduled_start', airdrying_e.scheduled_start_time,
                        'actual_start', airdrying_e.actual_start_time,
                        'duration_minutes', airdrying_e.duration_minutes,
                        'scheduled_end', airdrying_e.scheduled_end_time,
                        'actual_end', airdrying_e.actual_end_time,
                        'status', airdrying_e.status
                    ),
                    'cracking', jsonb_build_object(
                        'scheduled_start', cracking_e.scheduled_start_time,
                        'actual_start', cracking_e.actual_start_time,
                        'scheduled_end', cracking_e.scheduled_end_time,
                        'actual_end', cracking_e.actual_end_time,
                        'machine_type_id', cracking_e.machine_type_id,
                        'machine_type_name', mt.name,
                        'status', cracking_e.status
                    ),
                    'shelling', jsonb_build_object(
                        'scheduled_start', shelling_e.scheduled_start_time,
                        'actual_start', shelling_e.actual_start_time,
                        'scheduled_end', shelling_e.scheduled_end_time,
                        'actual_end', shelling_e.actual_end_time,
                        'status', shelling_e.status
                    )
                )
                ORDER BY er.repetition_number
            ),
            '[]'::jsonb
        ) as repetitions
    FROM public.experiments e
    LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
    LEFT JOIN public.experiment_repetitions er ON er.experiment_id = e.id
    LEFT JOIN public.experiment_phase_executions soaking_e
        ON soaking_e.repetition_id = er.id AND soaking_e.phase_type = 'soaking'
    LEFT JOIN public.experiment_phase_executions airdrying_e
        ON airdrying_e.repetition_id = er.id AND airdrying_e.phase_type = 'airdrying'
    LEFT JOIN public.experiment_phase_executions cracking_e
        ON cracking_e.repetition_id = er.id AND cracking_e.phase_type = 'cracking'
    LEFT JOIN public.experiment_phase_executions shelling_e
        ON shelling_e.repetition_id = er.id AND shelling_e.phase_type = 'shelling'
    LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id
    WHERE e.id = p_experiment_id
    GROUP BY e.id, e.experiment_number, ep.name;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

GRANT SELECT ON public.experiments_with_all_reps_and_phases TO authenticated;
GRANT EXECUTE ON FUNCTION public.get_experiment_with_reps_and_phases(UUID) TO authenticated;
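
A sketch of calling the aggregate function from SQL (the UUID below is a placeholder, not a real id):

```sql
-- Returns one row per experiment, with its repetitions folded into a JSONB array
-- (or '[]' when the experiment has no repetitions).
SELECT experiment_number,
       phase_name,
       jsonb_array_length(repetitions) AS rep_count
FROM public.get_experiment_with_reps_and_phases('00000000-0000-0000-0000-000000000001'); -- placeholder id
```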
118
supabase/migrations/00017_experiment_phase_config_tables.sql
Normal file
@@ -0,0 +1,118 @@
-- Experiment-level phase config tables
-- One row per experiment per phase; linked by experiment_id. Used when creating an experiment
-- so soaking, airdrying, cracking, and shelling parameters are stored and can be applied to repetitions.

-- =============================================
-- 1. EXPERIMENT_SOAKING (template for soaking phase)
-- =============================================

CREATE TABLE IF NOT EXISTS public.experiment_soaking (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
    soaking_duration_hr DOUBLE PRECISION NOT NULL CHECK (soaking_duration_hr >= 0),
    created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    created_by UUID NOT NULL REFERENCES public.user_profiles(id),
    CONSTRAINT unique_experiment_soaking_per_experiment UNIQUE (experiment_id)
);

CREATE INDEX IF NOT EXISTS idx_experiment_soaking_experiment_id ON public.experiment_soaking(experiment_id);
GRANT ALL ON public.experiment_soaking TO authenticated;
ALTER TABLE public.experiment_soaking ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment soaking config is viewable by authenticated" ON public.experiment_soaking FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment soaking config is insertable by authenticated" ON public.experiment_soaking FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment soaking config is updatable by authenticated" ON public.experiment_soaking FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment soaking config is deletable by authenticated" ON public.experiment_soaking FOR DELETE USING (auth.role() = 'authenticated');

CREATE TRIGGER set_updated_at_experiment_soaking
    BEFORE UPDATE ON public.experiment_soaking
    FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();

-- =============================================
-- 2. EXPERIMENT_AIRDRYING (template for airdrying phase)
-- =============================================

CREATE TABLE IF NOT EXISTS public.experiment_airdrying (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
    duration_minutes INTEGER NOT NULL CHECK (duration_minutes >= 0),
    created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    created_by UUID NOT NULL REFERENCES public.user_profiles(id),
    CONSTRAINT unique_experiment_airdrying_per_experiment UNIQUE (experiment_id)
);

CREATE INDEX IF NOT EXISTS idx_experiment_airdrying_experiment_id ON public.experiment_airdrying(experiment_id);
GRANT ALL ON public.experiment_airdrying TO authenticated;
ALTER TABLE public.experiment_airdrying ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment airdrying config is viewable by authenticated" ON public.experiment_airdrying FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment airdrying config is insertable by authenticated" ON public.experiment_airdrying FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment airdrying config is updatable by authenticated" ON public.experiment_airdrying FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment airdrying config is deletable by authenticated" ON public.experiment_airdrying FOR DELETE USING (auth.role() = 'authenticated');

CREATE TRIGGER set_updated_at_experiment_airdrying
    BEFORE UPDATE ON public.experiment_airdrying
    FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();

-- =============================================
-- 3. EXPERIMENT_CRACKING (template for cracking; supports JC and Meyer params)
-- =============================================

CREATE TABLE IF NOT EXISTS public.experiment_cracking (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
    machine_type_id UUID NOT NULL REFERENCES public.machine_types(id) ON DELETE RESTRICT,
    -- JC Cracker parameters (nullable; used when machine is JC)
    plate_contact_frequency_hz DOUBLE PRECISION CHECK (plate_contact_frequency_hz IS NULL OR plate_contact_frequency_hz > 0),
    throughput_rate_pecans_sec DOUBLE PRECISION CHECK (throughput_rate_pecans_sec IS NULL OR throughput_rate_pecans_sec > 0),
    crush_amount_in DOUBLE PRECISION CHECK (crush_amount_in IS NULL OR crush_amount_in >= 0),
    entry_exit_height_diff_in DOUBLE PRECISION,
    -- Meyer Cracker parameters (nullable; used when machine is Meyer)
    motor_speed_hz DOUBLE PRECISION CHECK (motor_speed_hz IS NULL OR motor_speed_hz > 0),
    jig_displacement_inches DOUBLE PRECISION CHECK (jig_displacement_inches IS NULL OR jig_displacement_inches >= 0),
    spring_stiffness_nm DOUBLE PRECISION CHECK (spring_stiffness_nm IS NULL OR spring_stiffness_nm > 0),
    created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    created_by UUID NOT NULL REFERENCES public.user_profiles(id),
    CONSTRAINT unique_experiment_cracking_per_experiment UNIQUE (experiment_id)
);

CREATE INDEX IF NOT EXISTS idx_experiment_cracking_experiment_id ON public.experiment_cracking(experiment_id);
CREATE INDEX IF NOT EXISTS idx_experiment_cracking_machine_type_id ON public.experiment_cracking(machine_type_id);
GRANT ALL ON public.experiment_cracking TO authenticated;
ALTER TABLE public.experiment_cracking ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment cracking config is viewable by authenticated" ON public.experiment_cracking FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment cracking config is insertable by authenticated" ON public.experiment_cracking FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment cracking config is updatable by authenticated" ON public.experiment_cracking FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment cracking config is deletable by authenticated" ON public.experiment_cracking FOR DELETE USING (auth.role() = 'authenticated');

CREATE TRIGGER set_updated_at_experiment_cracking
    BEFORE UPDATE ON public.experiment_cracking
    FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();

-- =============================================
-- 4. EXPERIMENT_SHELLING (template for shelling phase)
-- =============================================

CREATE TABLE IF NOT EXISTS public.experiment_shelling (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
    ring_gap_inches NUMERIC(6,2) CHECK (ring_gap_inches IS NULL OR ring_gap_inches > 0),
    drum_rpm INTEGER CHECK (drum_rpm IS NULL OR drum_rpm > 0),
    created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    created_by UUID NOT NULL REFERENCES public.user_profiles(id),
    CONSTRAINT unique_experiment_shelling_per_experiment UNIQUE (experiment_id)
);

CREATE INDEX IF NOT EXISTS idx_experiment_shelling_experiment_id ON public.experiment_shelling(experiment_id);
GRANT ALL ON public.experiment_shelling TO authenticated;
ALTER TABLE public.experiment_shelling ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment shelling config is viewable by authenticated" ON public.experiment_shelling FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment shelling config is insertable by authenticated" ON public.experiment_shelling FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment shelling config is updatable by authenticated" ON public.experiment_shelling FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment shelling config is deletable by authenticated" ON public.experiment_shelling FOR DELETE USING (auth.role() = 'authenticated');

CREATE TRIGGER set_updated_at_experiment_shelling
    BEFORE UPDATE ON public.experiment_shelling
    FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();
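
A sketch of populating the per-experiment config tables at experiment-creation time (all ids and parameter values below are placeholders; each table's UNIQUE constraint allows at most one config row per experiment):

```sql
-- Placeholder experiment id; one config row per experiment per phase.
INSERT INTO public.experiment_soaking (experiment_id, soaking_duration_hr, created_by)
VALUES ('00000000-0000-0000-0000-000000000001', 12.0,
        (SELECT id FROM public.user_profiles LIMIT 1));

-- JC Cracker config: only the JC parameter columns are set; Meyer columns stay NULL.
INSERT INTO public.experiment_cracking
    (experiment_id, machine_type_id, plate_contact_frequency_hz, throughput_rate_pecans_sec, created_by)
SELECT '00000000-0000-0000-0000-000000000001', mt.id, 4.0, 2.5,
       (SELECT id FROM public.user_profiles LIMIT 1)
FROM public.machine_types mt
WHERE mt.name = 'JC Cracker';
```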
@@ -566,11 +566,11 @@ INSERT INTO public.machine_types (name, description, created_by) VALUES
ON CONFLICT (name) DO NOTHING;

-- =============================================
-- 5. CREATE EXPERIMENT PHASES
-- 5. CREATE EXPERIMENT BOOKS (table renamed from experiment_phases in migration 00016)
-- =============================================

-- Create "Phase 2 of JC Experiments" phase
INSERT INTO public.experiment_phases (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
-- Create "Phase 2 of JC Experiments" book
INSERT INTO public.experiment_books (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
SELECT
'Phase 2 of JC Experiments',
'Second phase of JC Cracker experiments for pecan processing optimization',
@@ -584,8 +584,8 @@ FROM public.user_profiles up
WHERE up.email = 's.alireza.v@gmail.com'
;

-- Create "Post Workshop Meyer Experiments" phase
INSERT INTO public.experiment_phases (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
-- Create "Post Workshop Meyer Experiments" book
INSERT INTO public.experiment_books (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
SELECT
'Post Workshop Meyer Experiments',
'Post workshop Meyer Cracker experiments for pecan processing optimization',

@@ -1,7 +1,9 @@
-- ==============================================
-- 6. CREATE EXPERIMENTS FOR PHASE 2
-- ==============================================

-- TEMPORARILY DISABLED (see config.toml sql_paths). When re-enabling, replace
-- all "experiment_phases" with "experiment_books" (table renamed in migration 00016).
--
-- This seed file creates experiments from phase_2_JC_experimental_run_sheet.csv
-- Each experiment has 3 repetitions with specific parameters
-- Experiment numbers are incremented by 1 (CSV 0-19 becomes DB 1-20)