155 Commits

Author SHA1 Message Date
salirezav
9bcb5c203e Refactor Experiment and Phase components for improved state management and UI consistency
- Resolved merge conflicts and cleaned up code in ExperimentForm, enhancing initial state handling and validation logic.
- Updated PhaseForm to reflect correct placeholder text for descriptions.
- Simplified Sidebar component by removing unused props and ensuring consistent behavior across expanded and collapsed states.
- Adjusted Scheduling component to streamline repetition management and improve user experience.
2026-03-09 13:53:03 -04:00
Alireza Vaezi
97ded48d54 gitignore updates 2026-03-09 13:10:12 -04:00
Alireza Vaezi
cd55dda295 Merge remote-tracking branch 'old-github/main' into integrate-old-refactors-of-github 2026-03-09 13:10:08 -04:00
Alireza Vaezi
9e97893c89 Merge pull request #3 from salirezav/db-fixes-and-patches
Db fixes and patches
2026-03-09 12:51:23 -04:00
Alireza Vaezi
0882d8d484 Merge branch 'main' into db-fixes-and-patches 2026-03-09 12:50:57 -04:00
salirezav
0a2b24fdbf Refactor Experiment components to support new experiment book structure
- Updated ExperimentForm to handle additional phase parameters and improved initial state management.
- Modified ExperimentModal to fetch experiment data with phase configuration and ensure unique experiment numbers within the same phase.
- Renamed references from "phases" to "books" across ExperimentPhases, PhaseExperiments, and related components for consistency with the new terminology.
- Enhanced error handling and validation for new shelling parameters in ExperimentForm.
- Updated Supabase interface definitions to reflect changes in experiment and phase data structures.
2026-03-09 12:43:23 -04:00
salirezav
38a7846e7b Update .gitignore and refactor docker-compose.sh for improved host IP detection
- Added .host-ip to .gitignore to prevent tracking of host IP configurations.
- Enhanced docker-compose.sh to prioritize HOST_IP environment variable, fallback to .host-ip file, and finally auto-detect the host IP, improving robustness in various environments.
- Updated documentation to reflect changes in Supabase migration paths and removed references to deprecated management-dashboard-web-app directory.
2026-02-09 13:21:08 -05:00
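The fallback order described in that commit (HOST_IP env var, then the `.host-ip` file, then auto-detection) can be sketched as a small resolver. This is an illustrative TypeScript sketch, not the shell script's actual internals; the injected reader and detector are assumptions so the precedence logic stands alone:

```typescript
// Precedence sketch: explicit env var wins, then a local override file,
// then auto-detection. Sources are injected so the logic is testable;
// names are illustrative, not the script's real helpers.
type HostIpSources = {
  env: Record<string, string | undefined>; // e.g. process.env
  readHostIpFile: () => string | null;     // contents of .host-ip, or null if absent
  autoDetect: () => string;                // e.g. parse the default route's source IP
};

function resolveHostIp(sources: HostIpSources): string {
  const fromEnv = sources.env["HOST_IP"]?.trim();
  if (fromEnv) return fromEnv;             // 1. HOST_IP environment variable

  const fromFile = sources.readHostIpFile()?.trim();
  if (fromFile) return fromFile;           // 2. fallback: .host-ip file

  return sources.autoDetect();             // 3. last resort: auto-detect
}
```

Keeping the three sources behind one function makes the "robust in various environments" claim concrete: CI sets `HOST_IP`, developers pin `.host-ip`, and everything else auto-detects.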
salirezav
6a8f238bee fixed scheduling not loading unless logged in 2026-02-09 13:20:40 -05:00
salirezav
1452a42ef4 Merge branch 'main' of https://github.com/salirezav/usda-vision 2026-02-02 13:05:29 -05:00
salirezav
5da5347443 Merge branch 'fix/scheduling-ui-ux' 2026-02-02 12:38:21 -05:00
salirezav
c8cd95361a Update README.md to reflect new services and development setup
- Added descriptions for new services: media-api, video-remote, vision-system-remote, and scheduling-remote.
- Updated development instructions to use docker-compose for starting the stack.
- Changed web port from 5173 to 8080 and clarified development commands.
- Removed MQTT broker details and updated service sections accordingly.
2026-02-02 12:34:59 -05:00
salirezav
309659c65d Update .gitignore to exclude Supabase CLI temporary files
- Added entries to ignore the Supabase CLI temporary files located in management-dashboard-web-app/supabase/.temp/cli-latest and supabase/.temp/cli-latest, preventing unnecessary clutter in version control.
2026-02-02 11:30:43 -05:00
salirezav
df6c849ca4 db fixes and patches: phase data tables and unified phase executions migrations, database entities doc 2026-02-02 11:27:53 -05:00
salirezav
49ddcfd002 Refactor docker-compose setup and enhance scheduling components
- Re-enabled Vision API and Media API services in docker-compose.yml, providing necessary configurations for development.
- Improved scheduling logic in HorizontalTimelineCalendar and Scheduling components to better manage repetition visibility and database scheduling status.
- Updated docker-compose-reset.sh to conditionally wait for Supabase services, enhancing the setup process for local development.
- Added isScheduledInDb prop to manage UI states for scheduled repetitions, improving user experience in the scheduling interface.
2026-02-02 11:25:37 -05:00
Hunter Halloran
78bfcf0261 Mount Nix camera SDK into docker container with proper LD_LIBRARY_PATH 2026-01-30 18:07:18 -05:00
Hunter Halloran
194f3fbd9a Fix Nix string concatenation in preStart 2026-01-30 17:55:19 -05:00
Hunter Halloran
147c21a19b Fix missing 'fi' in shell script 2026-01-30 17:53:49 -05:00
Hunter Halloran
5cb5a78032 Fix env_file paths and ANON_KEY references in docker-compose.yml
- Update sed pattern to correctly match absolute env_file paths
- Replace [REDACTED] placeholders with VITE_SUPABASE_ANON_KEY variable reference
- Fix missing file path in sed command
2026-01-30 17:52:17 -05:00
Hunter Halloran
53314a0896 fix: Link .env file correctly (I hope) 2026-01-30 17:38:03 -05:00
Hunter Halloran
998a84f992 Fix module.nix package references 2026-01-30 17:26:16 -05:00
Hunter Halloran
59d3a1eec1 Move usda-vision module to flake nixosModules output 2026-01-30 17:24:14 -05:00
Hunter Halloran
dce72a6ab9 feat: Export NixOS Module for usda-vision service config 2026-01-30 17:18:06 -05:00
Hunter Halloran
065e5f368f fix: Move ragenix to externally managed, and ask for env file references 2026-01-30 12:48:48 -05:00
Hunter Halloran
20a01c89af fix: Move ragenix to externally managed, and ask for env file references 2026-01-30 12:48:48 -05:00
Hunter
3434082584 Merge pull request #2 from salirezav/feat/nix-package
feat: Add flake and ragenix package generation and dev environment
2026-01-30 12:20:19 -05:00
Hunter
f7c1d86244 Merge pull request #2 from salirezav/feat/nix-package
feat: Add flake and ragenix package generation and dev environment
2026-01-30 12:20:19 -05:00
Hunter Halloran
cfa31347c6 feat: Add flake and ragenix package generation and dev environment 2026-01-30 12:05:35 -05:00
Hunter Halloran
b77bca6f0a feat: Add flake and ragenix package generation and dev environment 2026-01-30 12:05:35 -05:00
Hunter
8cb234e959 Merge pull request #1 from salirezav/oauth2-login
feat: Add Oauth2 login support
2026-01-30 11:52:14 -05:00
Hunter
59ad94bba2 Merge pull request #1 from salirezav/oauth2-login
feat: Add Oauth2 login support
2026-01-30 11:52:14 -05:00
salirezav
780a95549b Enhance ErrorBoundary and improve repetition placement logic in scheduling components
- Updated ErrorBoundary to include auto-retry functionality with configurable retry parameters.
- Refined repetition placement logic in HorizontalTimelineCalendar to handle overlaps more accurately, ensuring better visual representation of scheduling data.
- Added comments for clarity on sorting and overlap checks within the repetition handling process.
2026-01-14 16:41:10 -05:00
salirezav
c54385a90c Enhance ErrorBoundary component with auto-retry functionality
- Added auto-retry feature to ErrorBoundary for handling errors during module federation loading.
- Introduced props for configuring retry behavior: autoRetry, retryDelay, and maxRetries.
- Implemented retry count management and UI feedback for ongoing retries.
- Updated component lifecycle methods to manage retries and cleanup effectively.
- Refactored handleRetry method to reset retry count upon manual retry.
2026-01-14 16:41:03 -05:00
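The retry policy those props describe (autoRetry, retryDelay, maxRetries) boils down to one decision per error. A hedged sketch, with prop names mirroring the commit but the function itself hypothetical rather than the component's actual code:

```typescript
// Retry-policy sketch for an error boundary around module federation loads.
interface RetryPolicy {
  autoRetry: boolean; // whether to retry automatically after an error
  retryDelay: number; // milliseconds to wait before the next attempt
  maxRetries: number; // give up after this many automatic retries
}

// Decide what to do after an error, given how many retries have already run.
function nextRetry(
  policy: RetryPolicy,
  retryCount: number
): { retry: false } | { retry: true; delayMs: number } {
  if (!policy.autoRetry || retryCount >= policy.maxRetries) {
    return { retry: false };            // surface the error and offer manual retry
  }
  return { retry: true, delayMs: policy.retryDelay };
}
```

In the component this would drive a `setTimeout` in `componentDidCatch`, with the timer cleared on unmount and the count reset on manual retry, matching the lifecycle cleanup the commit mentions.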
salirezav
afefd32c3b Refactor development workflow for scheduling-remote
- Updated dev:watch script in package.json to streamline the build and serve process.
- Removed start-dev.sh script as its functionality is now integrated into the npm command.
2026-01-14 16:28:07 -05:00
salirezav
325dc89c51 Enhance HorizontalTimelineCalendar and Scheduling components with repetition metadata and improved interaction
- Added RepetitionMetadata interface to manage phase details for repetitions.
- Implemented onScrollToRepetition and onScheduleRepetition callbacks for better user navigation and scheduling.
- Updated HorizontalTimelineCalendar to display phase names, experiment numbers, and repetition numbers in the timeline.
- Removed locked state management from Scheduling component, simplifying repetition handling.
- Improved conductor assignment visibility and interaction within the scheduling interface.
2026-01-14 16:04:51 -05:00
salirezav
0a4df9073c Refactor Sidebar component to remove hover state management and adjust expansion logic
- Removed isHovered and setIsHovered props from SidebarProps.
- Updated sidebar expansion logic to rely solely on isExpanded and isMobileOpen.
- Simplified rendering conditions for menu items and submenus based on the new state management.
2026-01-14 16:04:45 -05:00
salirezav
87ff14705e Disable Vision API and Media API services in docker-compose.yml for development; add start-dev.sh script for scheduling-remote to streamline development workflow. 2026-01-14 16:04:39 -05:00
salirezav
3eeaa72145 Implement sidebar state persistence in DashboardLayout
- Changed initial state of sidebar expansion to false.
- Added functions to save and retrieve sidebar state from localStorage.
- Updated useEffect to load saved sidebar state on component mount.
- Modified toggleSidebar function to save the new state after toggling.
2026-01-14 16:04:28 -05:00
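The persistence steps above fit in two small helpers. A minimal sketch — the key name and function names are assumptions for illustration, and a Storage-like interface is injected so the logic runs outside a browser:

```typescript
// localStorage persistence sketch for the sidebar's expanded state.
const SIDEBAR_KEY = "sidebar-expanded";   // hypothetical storage key

interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function saveSidebarState(store: KVStore, expanded: boolean): void {
  store.setItem(SIDEBAR_KEY, JSON.stringify(expanded));
}

// Default to collapsed (false) when nothing was saved or parsing fails,
// matching the commit's new initial state.
function loadSidebarState(store: KVStore): boolean {
  const raw = store.getItem(SIDEBAR_KEY);
  if (raw === null) return false;
  try {
    return JSON.parse(raw) === true;
  } catch {
    return false;
  }
}
```

`loadSidebarState` would run once on mount (the commit's `useEffect`), and `saveSidebarState` inside `toggleSidebar` after computing the new value.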
salirezav
17619dce47 Merge remote-tracking branch 'origin/main' into fix/scheduling-ui-ux 2026-01-13 14:46:11 -05:00
salirezav
0c434e7e7f Refactor HorizontalTimelineCalendar and Scheduling components for improved functionality
- Updated HorizontalTimelineCalendar to support full 24-hour scheduling with enhanced marker positioning and dragging capabilities.
- Introduced extendLeft and extendRight properties in RepetitionBorder for better visual representation of markers extending beyond their borders.
- Enhanced tooltip functionality for vertical lines in the timeline, providing clearer time information.
- Modified Scheduling component to allow scheduling from midnight to midnight, improving user experience in time selection.
- Adjusted Docker script permissions for better execution control.
2026-01-13 14:41:48 -05:00
Hunter Halloran
83084158b5 feat: Enable UGA SSO with Microsoft Entra 2026-01-13 13:47:33 -05:00
Hunter Halloran
f625a3e9e1 feat: Enable UGA SSO with Microsoft Entra 2026-01-13 13:47:33 -05:00
Hunter Halloran
32504f7196 feat: Add Azure external auth provider 2026-01-09 12:52:42 -05:00
Hunter Halloran
0b2c698ea5 feat: Add Azure external auth provider 2026-01-09 12:52:42 -05:00
Hunter Halloran
94e618bf91 feat: Begin support for OIDC login 2026-01-09 12:17:00 -05:00
Hunter Halloran
d09fddf960 feat: Begin support for OIDC login 2026-01-09 12:17:00 -05:00
UGA Innovation Factory
5fdc02d2fc update 2025-12-19 18:35:42 -05:00
UGA Innovation Factory
d6b0d5bd89 round bottom border on password 2025-12-19 18:20:11 -05:00
UGA Innovation Factory
4b3a66a02c round bottom border on password 2025-12-19 18:14:59 -05:00
UGA Innovation Factory
0d3c80aa6d fix for password too 2025-12-19 18:11:33 -05:00
UGA Innovation Factory
e60b3d61f4 actually fix text color 2025-12-19 18:07:57 -05:00
UGA Innovation Factory
ae5e6670e0 apply text color to body 2025-12-19 17:52:45 -05:00
UGA Innovation Factory
ab9c164b51 respect font import better with safari 2025-12-19 17:40:41 -05:00
UGA Innovation Factory
c7e409e571 updated dark color scheme to show text on login 2025-12-19 17:29:28 -05:00
UGA Innovation Factory
5ec8e3d9de add external url to allowed hosts 2025-12-19 16:05:48 -05:00
salirezav
c5d5956daf feat: Add local Supabase configuration for Vite development
- Introduced a new backup environment file containing local Supabase and Vision API configurations.
- Configured environment variables for Supabase URL, anonymous key, and various Vision API settings.
- Enabled modules for video, vision system, and scheduling in the development environment.
2025-12-18 20:17:14 -05:00
salirezav
8f4225a62e feat: Add dynamic host IP detection for Docker Compose and Supabase config
- Add docker-compose.sh wrapper script that auto-detects host IP
- Update docker-compose.yml to use environment variable substitution
- Update Supabase config.toml files to use HOST_SITE_URL and SUPABASE_API_URL env vars
- Add scripts/get-host-ip.sh for IP detection
- Add scripts/set-host-env.sh for environment setup
- Add scripts/supabase-with-env.sh wrapper for Supabase CLI
- Add documentation for Docker Compose environment setup
- Update README.md with new usage instructions
- Replace hardcoded URLs with dynamic environment variables
2025-12-18 19:57:27 -05:00
salirezav
8cb45cbe03 Refactor Supabase services in docker-compose.yml for better organization and testing
- Commented out all Supabase services to facilitate testing with Supabase CLI.
- Updated README to include Supabase directory in project structure.
- Adjusted documentation for migration paths in Supabase Docker Compose guide.
- Enhanced docker-compose-reset.sh to explicitly remove Supabase volumes and wait for migrations to complete.
2025-12-18 18:27:04 -05:00
salirezav
93c68768d8 feat: Integrate Supabase containers into main docker-compose
- Add all Supabase services (db, rest, auth, realtime, storage, studio, meta, inbucket)
- Add migration runner service to automatically run migrations on startup
- Configure all services to use shared network for inter-service communication
- Add documentation for Supabase docker-compose integration
- Add helper script for generating Supabase secrets
- Update web service to connect to Supabase via network
2025-12-18 15:59:24 -05:00
salirezav
8d8b639a35 Enhance AvailabilityCalendar component with loading state, toast notifications, and delete confirmation modal
- Added loading state to indicate data fetching progress.
- Implemented toast notifications for success and error messages during availability operations.
- Introduced a delete confirmation modal for improved user experience when removing availability slots.
- Enhanced TimeSlotModal with validation error handling and loading indicators for adding time slots.
2025-12-18 15:56:20 -05:00
salirezav
6cf67822dc Commit changes before merging to main 2025-12-18 14:33:05 -05:00
salirezav
9159ab68f3 Remove CalendarStyles.css and Scheduling.tsx components and update scheduling
- Removed the CalendarStyles.css and Scheduling.tsx components, updating the project structure for improved maintainability.
- Updated the Supabase CLI version and modified the experiment_repetitions SQL migration to include scheduled_date.
- Enhanced the scheduling component in the remote app with improved drag-and-drop functionality and UI adjustments for better user experience.
2025-12-12 12:53:52 -05:00
salirezav
bada5a073d Enhance scheduling component and improve data handling
- Added conductorsExpanded state to manage the visibility of the conductors list in the scheduling component.
- Updated color assignment logic for conductors to ensure consistent coloring based on their position in the array.
- Modified spawnSingleRepetition function to accept an updated set of selected IDs for accurate stagger calculations.
- Refactored soaking and airdrying data retrieval to map results from a unified experiment_phase_executions table, improving data consistency.
- Enhanced UI for conductor selection, including a collapsible list and improved availability indicators.
2025-12-05 11:10:21 -05:00
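Position-based color assignment, as described in the second bullet, means a conductor's color depends only on its index in the array, so it stays stable across renders. A sketch with a placeholder palette (not the app's actual colors):

```typescript
// Consistent color-by-position sketch for the conductors list.
const PALETTE = ["#2563eb", "#16a34a", "#d97706", "#dc2626", "#7c3aed"];

function conductorColor(conductorIds: string[], id: string): string {
  const index = conductorIds.indexOf(id);
  if (index === -1) return "#6b7280";         // unknown conductor: neutral gray
  return PALETTE[index % PALETTE.length];     // wrap when conductors outnumber colors
}
```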
salirezav
933d4417a5 Update Docker configuration, enhance error handling, and improve logging
- Added health check to the camera management API service in docker-compose.yml for better container reliability.
- Updated installation scripts in Dockerfile to check for existing dependencies before installation, improving efficiency.
- Enhanced error handling in the USDAVisionSystem class to allow partial operation if some components fail to start, preventing immediate shutdown.
- Improved logging throughout the application, including more detailed error messages and critical error handling in the main loop.
- Refactored WebSocketManager and CameraMonitor classes to use debug logging for connection events, reducing log noise.
2025-12-03 17:23:31 -05:00
salirezav
2bce817b4e Enhance media API with video file validation and Docker configuration update
- Added a function to check if video files are complete and valid using ffprobe, preventing errors during thumbnail generation.
- Updated thumbnail generation logic to skip incomplete or corrupted files, improving robustness.
- Modified docker-compose.yml to include a restart policy for the camera management API service, ensuring better container reliability.
2025-12-03 14:56:18 -05:00
salirezav
d454c64168 Add background thumbnail generator to media API
- Implemented a background worker for generating video thumbnails, enhancing media processing capabilities.
- Added startup and shutdown events to manage the thumbnail generator's lifecycle.
- Refactored thumbnail generation logic to handle file processing more robustly, including checks for existing thumbnails and file accessibility.
- Updated existing functions to integrate with the new background processing approach, ensuring seamless thumbnail generation on demand.
2025-12-02 12:34:00 -05:00
salirezav
cb48020932 Enhance CameraPage UI and improve loading indicators
- Updated loading indicators with larger sizes and improved animations for better visibility.
- Enhanced error banner styling and increased padding for a more user-friendly experience.
- Adjusted stream status and recording status sections for improved readability and consistency in font sizes.
- Refined MQTT message log display with better spacing and text sizes for clearer information presentation.
- Improved overall layout and styling of the CameraPage component for a more polished user interface.
2025-12-01 15:40:32 -05:00
salirezav
b3a94d2d4f Enhance Docker Compose configuration and improve camera manager error handling
- Added container names for better identification of services in docker-compose.yml.
- Refactored CameraManager to include error handling during initialization of camera recorders and streamers, ensuring the system remains operational even if some components fail.
- Updated frontend components to support new MQTT Debug Panel functionality, enhancing monitoring capabilities.
2025-12-01 15:30:10 -05:00
salirezav
73849b40a8 Add MQTT publish request and response models, and implement publish route
- Introduced MQTTPublishRequest and MQTTPublishResponse models for handling MQTT message publishing.
- Implemented a new POST route for publishing MQTT messages, including error handling and logging.
- Enhanced the StandaloneAutoRecorder with improved logging during manual recording start.
- Updated the frontend to include an MQTT Debug Panel for better monitoring and debugging capabilities.
2025-12-01 13:07:36 -05:00
salirezav
5070d9b2ca Enhance media API transcoding and video streaming capabilities
- Added support for limiting concurrent transcoding operations in the media API to prevent resource exhaustion.
- Implemented functions to retrieve video duration and bitrate using ffprobe for improved streaming performance.
- Enhanced the generate_transcoded_stream function to handle HTTP range requests, allowing for more efficient video playback.
- Updated VideoModal component to disable fluid and responsive modes, ensuring proper container boundaries during video playback.
- Improved logging throughout the transcoding process for better error tracking and debugging.
2025-11-04 11:55:27 -05:00
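Handling HTTP range requests, as the third bullet mentions, starts with parsing the `Range` header against the file size. A sketch of that parsing under the usual HTTP byte-range semantics — illustrative only, not the media API's actual parser:

```typescript
// Parse a single "bytes=start-end" Range header against a known size.
// Returns an inclusive byte span, or null when the header is absent,
// malformed, or unsatisfiable (which a server would answer with 416).
function parseByteRange(
  header: string,
  size: number
): { start: number; end: number } | null {
  const m = /^bytes=(\d*)-(\d*)$/.exec(header.trim());
  if (!m) return null;
  const [, rawStart, rawEnd] = m;
  if (rawStart === "" && rawEnd === "") return null;
  if (rawStart === "") {
    // Suffix range: the last N bytes, e.g. "bytes=-500".
    const suffix = Math.min(Number(rawEnd), size);
    return suffix === 0 ? null : { start: size - suffix, end: size - 1 };
  }
  const start = Number(rawStart);
  const end = rawEnd === "" ? size - 1 : Math.min(Number(rawEnd), size - 1);
  if (start > end || start >= size) return null;  // unsatisfiable range
  return { start, end };
}
```

The resolved span then drives the `206 Partial Content` response (`Content-Range: bytes start-end/size`), which is what lets players seek without downloading the whole transcoded file.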
salirezav
de46753f15 Update session summary and clean up unused files
- Added additional notes to SESSION_SUMMARY.md regarding MQTT debugging and enhanced logging.
- Removed outdated SQL seed files related to Phase 2 JC Experiments and Meyer Experiments to streamline the codebase.
- Updated the CLI version in the Supabase configuration for consistency.
- Cleaned up test files in the camera management API to improve maintainability.
2025-11-03 23:07:22 -05:00
salirezav
4acad772f9 Update camera management and MQTT logging for improved functionality
- Changed log level in configuration from WARNING to INFO for better visibility of system operations.
- Enhanced StandaloneAutoRecorder initialization to accept camera manager, state manager, and event system for improved modularity.
- Updated recording routes to handle optional request bodies and improved error logging for better debugging.
- Added checks in CameraMonitor to determine if a camera is already in use before initialization, enhancing resource management.
- Improved MQTT client logging to provide more detailed connection and message handling information.
- Added new MQTT event handling capabilities to the VisionApiClient for better tracking of machine states.
2025-11-03 16:56:53 -05:00
salirezav
868aa3f036 Add scheduling-remote service to docker-compose and enhance camera error handling
- Introduced a new service for scheduling-remote in docker-compose.yml, allowing for better management of scheduling functionalities.
- Enhanced error handling in CameraMonitor and CameraStreamer classes to improve robustness during camera initialization and streaming processes.
- Updated various components in the management dashboard to support dark mode and improve user experience with consistent styling.
- Implemented feature flags for enabling/disabling modules, including the new scheduling module.
2025-11-02 19:33:13 -05:00
salirezav
f6a37ca1ba Remove deprecated files and scripts to streamline the codebase
- Deleted unused API test files, RTSP diagnostic scripts, and development utility scripts to reduce clutter.
- Removed outdated database schema and modularization proposal documents to maintain focus on current architecture.
- Cleaned up configuration files and logging scripts that are no longer in use, enhancing project maintainability.
2025-11-02 10:07:59 -05:00
salirezav
f1a9cb0c1e Refactor API route setup and enhance modularity
- Consolidated API route definitions by registering routes from separate modules for better organization and maintainability.
- Removed redundant route definitions from the APIServer class, improving code clarity.
- Updated camera monitoring and recording modules to utilize a shared context manager for suppressing camera SDK errors, enhancing error handling.
- Adjusted timeout settings in camera operations for improved reliability during frame capture.
- Enhanced logging and error handling across camera operations to facilitate better debugging and monitoring.
2025-11-01 15:53:01 -04:00
salirezav
1a8aa8a027 RTSP Fully Implemented 2025-11-01 14:58:25 -04:00
salirezav
43e1dace8c Enhance camera recording functionality with streamer integration
- Updated CameraRecorder to support frame sharing from CameraStreamer, allowing for more efficient video recording.
- Modified CameraManager to ensure streamer references are correctly assigned to recorders.
- Enhanced CameraStreamer to include a recording frame queue for concurrent access during recording.
- Improved logging for better tracking of recording states and streamer activity.
- Updated API tests to include new functionality for retrieving video lists.
2025-11-01 13:49:16 -04:00
salirezav
3cb6a8e76e Refactor MediaMTX configuration and enhance RTSP streaming logging
- Removed outdated timeout settings from MediaMTX configuration for improved stream handling.
- Updated CameraStreamer class to include detailed logging for RTSP streaming state and frame dimension checks.
- Added environment variable support for configuring MediaMTX host, enhancing flexibility in deployment.
2025-11-01 12:58:30 -04:00
salirezav
b7adc3788a Implement RTSP streaming functionality for cameras
- Added endpoints to start and stop RTSP streaming for cameras in the API.
- Enhanced CameraManager and CameraStreamer classes to manage RTSP streaming state and processes.
- Updated API documentation to include new RTSP streaming commands.
- Modified Docker configurations to include FFmpeg for RTSP streaming support.
- Adjusted MediaMTX settings for improved stream handling and timeout configurations.
2025-11-01 12:35:25 -04:00
salirezav
70f614e9ff Enhance video streaming capabilities and UI integration
- Added support for streaming video files with proper MIME type handling in the media API.
- Implemented transcoding functionality for H.264 compatibility on-the-fly using FFmpeg.
- Updated VideoModal component to utilize Video.js for improved video playback experience.
- Enhanced user interface with download options and better error handling for video playback.
- Updated package.json and package-lock.json to include new dependencies for video.js and related types.
2025-10-31 18:29:05 -04:00
salirezav
00d4e5b275 Enhance video remote service and UI components
- Updated docker-compose.yml to include new media-api and mediamtx services for improved video handling.
- Modified package.json and package-lock.json to add TypeScript types for React and React DOM.
- Refactored video-related components (VideoCard, VideoList, VideoModal) for better user experience and responsiveness.
- Improved FiltersBar and Pagination components with enhanced styling and functionality.
- Added loading and error states in VideoList for better user feedback during data fetching.
- Enhanced CSS styles for a more polished look across the application.
2025-10-31 18:06:40 -04:00
salirezav
0b724fe59b Refactor video streaming feature and update dependencies
- Replaced npm ci with npm install in docker-compose for better package management.
- Introduced remote component loading for the VideoStreamingPage with error handling.
- Updated the title in index.html to "Experiments Dashboard" for clarity.
- Added new video remote service configuration in docker-compose for improved integration.
- Removed deprecated files and components related to the video streaming feature to streamline the codebase.
- Updated package.json and package-lock.json to include @originjs/vite-plugin-federation for module federation support.
2025-10-30 15:36:19 -04:00
salirezav
9f669e7dff Enhance scheduling and drag-and-drop functionality in the Calendar component
- Improved drag-and-drop experience for event scheduling with visual feedback and better cursor styles.
- Added state management for tracking repetitions, including locked schedules and currently scheduling repetitions.
- Implemented re-staggering logic to prevent overlap of scheduled events.
- Enhanced event generation to include time points for soaking, airdrying, and cracking phases.
- Updated the calendar to preserve and restore scroll position during event updates.
- Refactored event handling to ensure smooth interaction and improved user experience.
2025-10-29 14:16:19 -04:00
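The re-staggering idea in the third bullet — pushing each event past the previous one so none overlap — can be reduced to a small pure function. A simplified sketch assuming a single fixed duration; the real component works on calendar events with per-phase durations:

```typescript
// Re-stagger sketch: given proposed start times (in minutes) and one
// shared duration, shift each event forward until it no longer overlaps
// the previous one. Returns the adjusted starts in chronological order.
function restagger(startsMin: number[], durationMin: number): number[] {
  const sorted = [...startsMin].sort((a, b) => a - b);
  const result: number[] = [];
  for (const start of sorted) {
    const prevEnd =
      result.length > 0 ? result[result.length - 1] + durationMin : -Infinity;
    result.push(Math.max(start, prevEnd)); // keep the start, or slide past the previous event
  }
  return result;
}
```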
salirezav
98c93f9e0e Change video storage directory from /storage to /mnt/nfs_share
- Update StorageConfig default base_path in core/config.py
- Update base_path and camera storage_paths in config.json and config.compose.json
- Update Docker Compose volume mounts to use /mnt/nfs_share
- Update start_system.sh to create /mnt/nfs_share directory
- Update convert_avi_to_mp4.sh to use new NFS path
- Update all documentation files to reflect new storage paths
- Videos now stored on NFS server at 192.168.1.249:/mnt/nfs_share/
2025-10-14 16:28:00 -04:00
salirezav
e675423258 Remove deprecated CSV files and update experiment seeding scripts
- Deleted unused CSV files: 'meyer experiments.csv' and 'phase_2_experimental_run_sheet.csv'.
- Updated SQL seed scripts to reflect changes in experiment data structure and ensure consistency with the latest experiment parameters.
- Enhanced user role assignments in the seed data to include 'conductor' alongside 'data recorder'.
- Adjusted experiment seeding logic to align with the corrected data from the CSV files.
2025-09-28 21:10:50 -04:00
salirezav
853cec1b13 Add drag-and-drop scheduling functionality to the Scheduling component
- Integrated react-dnd and react-dnd-html5-backend for drag-and-drop capabilities.
- Enhanced the Scheduling component to allow users to visually manage experiment repetitions on the calendar.
- Added state management for scheduled repetitions and their timing.
- Implemented select-all checkboxes for conductors and repetitions for improved user experience.
- Updated calendar event generation to include new repetition markers with distinct styles.
- Refactored event handling to support draggable repetition markers and update their timing dynamically.
2025-09-24 21:23:38 -04:00
salirezav
aaeb164a32 Refactor experiment management and update data structures
- Renamed columns in the experimental run sheet CSV for clarity.
- Updated the ExperimentForm component to include new fields for weight per repetition and additional parameters specific to Meyer Cracker experiments.
- Enhanced the data entry logic to handle new experiment phases and machine types.
- Refactored repetition scheduling logic to use scheduled_date instead of schedule_status for better clarity in status representation.
- Improved the user interface for displaying experiment phases and their associated statuses.
- Removed outdated seed data and updated database migration scripts to reflect the new schema changes.
2025-09-24 14:27:28 -04:00
salirezav
08538c87c3 Implement availability management in Scheduling component
- Added functionality to load user availability from the database on component mount.
- Integrated create and delete availability features with the backend.
- Refactored event handling to manage availability slots dynamically.
- Updated supabase.ts to include methods for fetching, creating, and deleting availability records.
2025-09-22 11:34:58 -04:00
salirezav
44c8c3f6dd Update environment configuration and enhance user management features
- Changed VITE_SUPABASE_URL in .env.example for deployment consistency.
- Added new user management functionality to reset user passwords in UserManagement component.
- Updated supabase.ts to include first and last name fields in user profiles and added password reset functionality.
- Enhanced DashboardLayout to include a user profile view and improved user display in TopNavbar.
- Updated seed.sql to create additional users with roles for testing purposes.
2025-09-22 11:20:15 -04:00
salirezav
0ba385eebc update scheduling 2025-09-19 12:46:03 -04:00
salirezav
4e0e9f9d3f Update modal backdrop blur effect and Vite configuration
- Changed backdrop blur effect in multiple modal components from 32px to 2px for a more subtle appearance.
- Updated Vite configuration to use automatic JSX runtime for improved compatibility with React 17 and above.
2025-09-19 12:43:37 -04:00
salirezav
ed6c242faa Enhance scheduling features in management dashboard
- Added new scheduling functionality with a dedicated Scheduling component to manage availability and experiment scheduling.
- Integrated react-big-calendar for visual calendar representation of availability slots.
- Updated Dashboard and DashboardLayout components to handle current route and pass it to child components.
- Implemented route handling for scheduling sub-routes to improve user navigation.
- Added new dependencies: moment and react-big-calendar for date handling and calendar UI.
- Improved user experience with dynamic URL updates based on selected scheduling views.
2025-09-19 12:33:25 -04:00
salirezav
d1fe478478 Refactor: enhance dashboard layout and experiment management features
- Added functionality to save and retrieve the current dashboard view in localStorage for improved user experience.
- Updated DashboardLayout component to handle view changes with access control based on user roles.
- Renamed Experiments component to ExperimentManagement for clarity.
- Introduced new ExperimentPhase interface and related utility functions for managing experiment phases.
- Updated seed data to include initial roles and experiment phases for testing.
- Cleaned up unnecessary blank lines in various files for better code readability.
2025-09-19 12:03:46 -04:00
salirezav
843071eda7 Remove unnecessary blank lines in camera-test.html for cleaner code. 2025-09-11 14:28:54 -04:00
salirezav
f8d720964e Update API endpoints for camera2 and enhance logging options in USDA Vision System
- Changed camera1 references to camera2 in API endpoint documentation for start and stop recording.
- Added debug and verbose logging options in the USDA Vision System to improve debugging capabilities.
- Updated TopNavbar component for improved user experience with cursor pointer styling.
2025-09-11 14:25:20 -04:00
salirezav
8e7b5b054f Add development scripts and Docker Compose configuration for local environment setup
- Introduced `dev-start.sh`, `dev-stop.sh`, `dev-logs.sh`, and `dev-shell.sh` for managing the development environment.
- Added `docker-compose.dev.yml` to define services for API and web applications with appropriate configurations.
- Updated `README.md` to include instructions for development mode and commands for managing the environment.
2025-09-11 14:24:21 -04:00
salirezav
5bdb070173 Enhance camera management features: add debug endpoint for camera manager state, implement live camera routes without authentication, and improve logging for camera initialization and status checks. Update Docker configuration to include environment variables for the web app. 2025-09-02 15:31:47 -04:00
salirezav
62dd0d162b Refactor: enhance API response schemas for pagination; update environment variables for Supabase and Vision API; improve Vite configuration for proxy routing 2025-08-12 13:48:17 -04:00
salirezav
7a939920fa Chore: update environment configuration for local development; modify Dockerfile to streamline SDK installation and enhance startup script for better directory handling; add time verification script for system time synchronization 2025-08-08 16:18:37 -04:00
salirezav
20907509b1 Refactor camera management to conditionally import SDK and handle mock mode; update API base URL references to localhost in documentation and code. 2025-08-08 13:20:31 -04:00
Alireza Vaezi
fc2da16728 Chore: rename api->camera-management-api and web->management-dashboard-web-app; update compose, ignore, README references 2025-08-07 22:07:25 -04:00
Alireza Vaezi
28dab3a366 Build: ensure bash available for vendor install.sh 2025-08-07 21:28:12 -04:00
Alireza Vaezi
4c9e9445e4 Chore: minor compose tweaks for SDK install at runtime and env 2025-08-07 21:27:06 -04:00
Alireza Vaezi
00ea41e11c Build: install Huateng SDK from local tarball in api image; compose builds api image and sets LD_LIBRARY_PATH 2025-08-07 21:27:00 -04:00
Alireza Vaezi
cf0bb48fac API Dockerfile: install camera SDK from local tarball under camera_sdk; compose: build API image and set LD_LIBRARY_PATH fallback 2025-08-07 21:25:58 -04:00
Alireza Vaezi
458c04608c Dev: run full system via main.py in compose; add config.compose.json pointing MQTT to 'mqtt' service and map /storage 2025-08-07 21:07:07 -04:00
Alireza Vaezi
6255279d0d Docs: add subtree workflow and Docker Compose quick start to README 2025-08-07 21:03:12 -04:00
Alireza Vaezi
2b25413040 Add .env.example for local dev env vars 2025-08-07 20:58:54 -04:00
Alireza Vaezi
eaf509fbf8 Add api/ and web/ via git subtree; add docker-compose.yml and unified .gitignore 2025-08-07 20:58:28 -04:00
Alireza Vaezi
0b0e575080 Add 'web/' from commit '81828f61cf893039b89d3cf1861555f31167c37d'
git-subtree-dir: web
git-subtree-mainline: 7dbb36d619
git-subtree-split: 81828f61cf
2025-08-07 20:57:47 -04:00
Alireza Vaezi
7dbb36d619 Add 'api/' from commit '14ac229098e65aa643f84e8e17e0c5f1aaf8d639'
git-subtree-dir: api
git-subtree-mainline: 4743f19aef
git-subtree-split: 14ac229098
2025-08-07 20:57:34 -04:00
Alireza Vaezi
81828f61cf feat(video-streaming): add ApiStatusIndicator, PerformanceDashboard, VideoDebugger, and VideoErrorBoundary components
- Implemented ApiStatusIndicator to monitor video API connection status with health check functionality.
- Created PerformanceDashboard for monitoring video streaming performance metrics in development mode.
- Developed VideoDebugger for diagnosing video streaming issues with direct access to test video URLs.
- Added VideoErrorBoundary to handle errors in video streaming components with user-friendly messages and recovery options.
- Introduced utility functions for performance monitoring and thumbnail caching to optimize video streaming operations.
- Added comprehensive tests for video streaming API connectivity and functionality.
2025-08-06 11:46:25 -04:00
Alireza Vaezi
14ac229098 streaming fix 2025-08-05 15:55:48 -04:00
Alireza Vaezi
07e8e52503 Add comprehensive video streaming module and AI agent integration guide
- Introduced a new video streaming module with endpoints for listing, streaming, and managing video files.
- Added detailed API documentation for video streaming, including features like HTTP range requests, thumbnail generation, and caching.
- Created a new AI Agent Video Integration Guide with step-by-step instructions for integrating with the video streaming API.
- Implemented a video reindexing script to update video statuses from "unknown" to "completed".
- Enhanced error handling and logging throughout the video streaming and reindexing processes.
2025-08-05 14:44:31 -04:00
Alireza Vaezi
228efb0f55 feat: add Pagination component for video list navigation
- Implemented a reusable Pagination component with first/last, previous/next, and numbered page buttons.
- Added PageInfo component to display current page and total items.
- Integrated pagination into VideoList component, allowing users to navigate through video pages.
- Updated useVideoList hook to manage current page and total pages state.
- Modified videoApi service to support pagination with offset-based API.
- Enhanced VideoCard styling for better UI consistency.
- Updated Tailwind CSS configuration to include custom colors and shadows for branding.
- Refactored video file settings to use 'h264' codec for better compatibility.
2025-08-05 13:56:26 -04:00
Alireza Vaezi
14757807aa Enhance video file handling: support multiple formats (AVI, MP4, WEBM) in storage manager 2025-08-05 13:53:28 -04:00
Alireza Vaezi
37553163db Implement video processing module with FFmpeg conversion, OpenCV metadata extraction, and file system repository
- Added FFmpegVideoConverter for video format conversion using FFmpeg.
- Implemented NoOpVideoConverter for scenarios where FFmpeg is unavailable.
- Created OpenCVMetadataExtractor for extracting video metadata.
- Developed FileSystemVideoRepository for managing video files in the file system.
- Integrated video services with dependency injection in VideoModule.
- Established API routes for video management and streaming.
- Added request/response schemas for video metadata and streaming information.
- Implemented caching mechanisms for video streaming.
- Included error handling and logging throughout the module.
2025-08-04 16:44:53 -04:00
Alireza Vaezi
7bc76d72f9 feat: Update camera configuration to support MP4 format with new settings
- Changed machine topic from "vibratory_conveyor" to "blower_separator" for camera1
- Updated exposure and gain settings for camera1
- Added new video recording settings: video_format, video_codec, video_quality, auto_start_recording_enabled, auto_recording_max_retries, auto_recording_retry_delay_seconds
- Enhanced documentation to reflect current configuration and API alignment
- Redesigned Camera Configuration UI to display read-only fields for system and auto-recording settings
- Improved handling of video format settings in the API and frontend
- Created CURRENT_CONFIGURATION.md for complete system configuration reference
2025-08-04 16:44:45 -04:00
Alireza Vaezi
1aaac68edd feat(video): Implement MP4 format support across frontend and backend
- Updated VideoModal to display web compatibility status for video formats.
- Enhanced VideoPlayer to dynamically fetch video MIME types and handle MP4 streaming.
- Introduced video file utilities for better handling of video formats and MIME types.
- Modified CameraConfig interface to include new video recording settings (format, codec, quality).
- Created comprehensive documentation for MP4 format integration and frontend implementation.
- Ensured backward compatibility with existing AVI files while promoting MP4 as the preferred format.
- Added validation and error handling for video format configurations.
2025-08-04 16:21:22 -04:00
Alireza Vaezi
551e5dc2e3 feat(video-streaming): Implement video streaming feature with components, hooks, services, and utilities
- Added centralized exports for video streaming components and hooks.
- Implemented `useVideoInfo` hook for fetching and managing video metadata and streaming information.
- Developed `useVideoList` hook for managing video list state, fetching, filtering, and pagination.
- Created `useVideoPlayer` hook for managing video player state and controls.
- Established `videoApiService` for handling API interactions related to video streaming.
- Defined TypeScript types for video streaming feature, including video metadata, API responses, and component props.
- Added utility functions for video operations, formatting, and data processing.
- Created main entry point for the video streaming feature, exporting all public APIs.
2025-08-04 15:02:48 -04:00
Alireza Vaezi
97f22d239d feat: Enhance camera streaming functionality with stop streaming feature and update UI for better user experience 2025-07-31 22:17:08 -04:00
Alireza Vaezi
28400fbfb8 Enhance camera configuration and auto-recording functionality
- Updated CameraStreamer to configure streaming settings from config.json, including manual exposure, gain, image quality, noise reduction, and color settings.
- Added new methods in CameraStreamer for configuring image quality, noise reduction, color settings, and advanced settings.
- Extended CameraConfig to include manual white balance RGB gains.
- Improved AutoRecordingManager to handle camera status updates and ensure proper recording starts/stops based on machine state changes.
- Created detailed configuration documentation for blower and conveyor cameras, outlining settings and their mappings to config.json.
- Implemented a comprehensive test script for auto-recording functionality with simulated MQTT messages, verifying correct behavior on machine state changes.
2025-07-29 13:54:16 -04:00
Alireza Vaezi
1f47e89a4d feat: Add camera preview functionality and recording controls to VisionSystem 2025-07-29 12:31:03 -04:00
Alireza Vaezi
0d20fe189d feat: Add CameraPreviewModal component for live camera streaming
feat: Implement useAuth hook for user authentication management

feat: Create useAutoRecording hook for managing automatic recording functionality

feat: Develop AutoRecordingManager to handle automatic recording based on MQTT events

test: Add test script to verify camera configuration API fix

test: Create HTML page for testing camera configuration API and auto-recording fields
2025-07-29 12:30:59 -04:00
Alireza Vaezi
a6514b72c9 Enhance API server configuration: add auto-recording settings and improve image quality parameters 2025-07-29 11:28:04 -04:00
Alireza Vaezi
7e3169f336 Remove compiled Python bytecode files from the __pycache__ directories 2025-07-29 11:27:58 -04:00
Alireza Vaezi
ff7cb2c8f3 Add comprehensive tests for camera streaming, time synchronization, and auto-recording functionality
- Implemented test script for camera streaming functionality, covering API endpoints and concurrent recording.
- Created time verification script to check system time synchronization against multiple APIs.
- Developed timezone utility tests to validate timezone functions and logging.
- Added integration tests for system components, including configuration, camera discovery, and API endpoints.
- Enhanced MQTT logging and API endpoint tests for machine and MQTT status.
- Established auto-recording tests to simulate state changes and verify automatic recording behavior.
- Created simple tests for auto-recording configuration and API model validation.
2025-07-29 11:15:10 -04:00
Alireza Vaezi
0c92b6c277 feat: Integrate auto-recording feature into USDA Vision Camera System
- Added instructions for implementing auto-recording functionality in the React app.
- Updated TypeScript interfaces to include new fields for auto-recording status and configuration.
- Created new API endpoints for enabling/disabling auto-recording and retrieving system status.
- Enhanced UI components to display auto-recording status, controls, and error handling.
- Developed a comprehensive Auto-Recording Feature Implementation Guide.
- Implemented a test script for validating auto-recording functionality, including configuration checks and API connectivity.
- Introduced AutoRecordingManager to manage automatic recording based on machine state changes with retry logic.
- Established a retry mechanism for failed recording attempts and integrated status tracking for auto-recording.
2025-07-29 09:43:14 -04:00
Alireza Vaezi
0a26a8046e Refine camera settings: adjust sharpness, contrast, gamma, noise filter, and auto white balance 2025-07-29 07:58:41 -04:00
Alireza Vaezi
ef0f9f85c5 Add USDA Vision Camera Streaming API and related functionality
- Implemented streaming API endpoints for starting, stopping, and retrieving live streams from cameras.
- Added support for concurrent streaming and recording operations.
- Created test scripts for frame conversion and streaming functionality.
- Developed a CameraStreamer class to manage live preview streaming without blocking recording.
- Included error handling and logging for camera operations.
- Added configuration endpoints for camera settings and real-time updates.
- Enhanced testing scenarios for various camera configurations and error handling.
2025-07-28 18:09:48 -04:00
Alireza Vaezi
104f6202fb feat(streaming): Add live streaming functionality for USDA Vision Camera system
- Introduced non-blocking live preview streaming that operates independently from recording.
- Implemented REST API endpoints for starting and stopping streams, and retrieving live streams.
- Developed a web interface (`camera_preview.html`) for users to control and view camera streams.
- Created TypeScript definitions for API integration in React projects.
- Added comprehensive testing script (`test_streaming.py`) to validate API endpoints and concurrent operations.
- Updated database migration to fix visibility of experiment repetitions for all authenticated users.
2025-07-28 17:53:59 -04:00
Alireza Vaezi
7bc8138f24 Add comprehensive test suite for USDA Vision Camera System
- Implemented main test script to verify system components and functionality.
- Added individual test scripts for camera exposure settings, API changes, camera recovery, maximum FPS, MQTT events, logging, and timezone functionality.
- Created service file for system management and automatic startup.
- Included detailed logging and error handling in test scripts for better diagnostics.
- Ensured compatibility with existing camera SDK and API endpoints.
2025-07-28 17:33:49 -04:00
Alireza Vaezi
d598281164 feat: Implement Vision System API Client with comprehensive endpoints and utility functions
- Added VisionApiClient class to interact with the vision system API.
- Defined interfaces for system status, machine status, camera status, recordings, and storage stats.
- Implemented methods for health checks, system status retrieval, camera control, and storage management.
- Introduced utility functions for formatting bytes, durations, and uptime.

test: Create manual verification script for Vision API functionality

- Added a test script to verify utility functions and API endpoints.
- Included tests for health check, system status, cameras, machines, and storage stats.

feat: Create experiment repetitions system migration

- Added experiment_repetitions table to manage experiment repetitions with scheduling.
- Implemented triggers and functions for validation and timestamp management.
- Established row-level security policies for user access control.

feat: Introduce phase-specific draft management system migration

- Created experiment_phase_drafts and experiment_phase_data tables for managing phase-specific drafts and measurements.
- Added pecan_diameter_measurements table for individual diameter measurements.
- Implemented row-level security policies for user access control.

fix: Adjust draft constraints to allow multiple drafts while preventing multiple submitted drafts

- Modified constraints on experiment_phase_drafts to allow multiple drafts in 'draft' or 'withdrawn' status.
- Ensured only one 'submitted' draft per user per phase per repetition.
2025-07-28 16:30:56 -04:00
Alireza Vaezi
9cb043ef5f feat: Add MQTT publisher and tester scripts for USDA Vision Camera System
- Implemented mqtt_publisher_test.py for manual MQTT message publishing
- Created mqtt_test.py to test MQTT message reception and display statistics
- Developed test_api_changes.py to verify API changes for camera settings and filename handling
- Added test_camera_recovery_api.py for testing camera recovery API endpoints
- Introduced test_max_fps.py to demonstrate maximum FPS capture functionality
- Implemented test_mqtt_events_api.py to test MQTT events API endpoint
- Created test_mqtt_logging.py for enhanced MQTT logging and API endpoint testing
- Added sdk_config.py for SDK initialization and configuration with error suppression
2025-07-28 16:30:14 -04:00
Alireza Vaezi
e2acebc056 holy shit a lot has changed! read the readme, I guess... 2025-07-28 16:29:41 -04:00
Alireza Vaezi
731d8cd9ff Enhance time synchronization checks, update storage paths, and improve camera recording management 2025-07-25 22:38:33 -04:00
Alireza Vaezi
69966519b0 Add project completion documentation for USDA Vision Camera System 2025-07-25 21:40:51 -04:00
Alireza Vaezi
381f51a3e6 Implement code changes to enhance functionality and improve performance 2025-07-25 21:39:52 -04:00
Alireza Vaezi
f6d6ba612e Massive update - API and other modules added 2025-07-25 21:39:07 -04:00
Alireza Vaezi
172f46d44d init commit 2025-07-25 12:07:30 -04:00
Alireza Vaezi
df42a0935d Initial commit 2025-07-25 12:06:17 -04:00
Alireza Vaezi
0d0c67d5c1 data entry and draft system work 2025-07-23 21:21:59 -04:00
Alireza Vaezi
511ed848a3 update dependencies and remove test credentials from Login component 2025-07-23 11:46:12 -04:00
Alireza Vaezi
4919efb845 add scheduling functionality for experiments with new ScheduleModal component 2025-07-20 21:07:58 -04:00
Alireza Vaezi
6797519b0a create new experiment works 2025-07-20 19:59:28 -04:00
Alireza Vaezi
bd0ae321de add new experiment 2025-07-20 17:04:51 -04:00
Alireza Vaezi
3ae77a2375 style improved 2025-07-20 12:06:13 -04:00
Alireza Vaezi
41d4654f9f style changed 2025-07-20 11:59:15 -04:00
Alireza Vaezi
b5848d9cba can successfully add new users 2025-07-20 11:33:51 -04:00
Alireza Vaezi
cfa8a0de81 css issue fixed 2025-07-20 11:10:52 -04:00
Alireza Vaezi
6a9ab6afaa RBAC seems to be working 2025-07-20 11:05:58 -04:00
Alireza Vaezi
033229989a Revert "RBAC in place. Tailwind CSS working."
This reverts commit 90d874b15f.
2025-07-18 21:18:07 -04:00
Alireza Vaezi
90d874b15f RBAC in place. Tailwind CSS working. 2025-07-17 12:10:23 -04:00
Alireza Vaezi
5fc7c89219 supabase added 2025-07-17 10:53:16 -04:00
Alireza Vaezi
6ee9371861 blank Vite + React + Tailwind CSS project 2025-07-17 10:44:08 -04:00
Alireza Vaezi
da7b6f7ae7 Initial commit 2025-07-17 10:39:47 -04:00
67 changed files with 5147 additions and 885 deletions

.env.azure.example Normal file

@@ -0,0 +1,24 @@
# Microsoft Entra (Azure AD) OAuth Configuration for Self-Hosted Supabase
# Copy this file to your actual environment configuration and fill in the values
# Azure Application (Client) ID
# Get this from Azure Portal > App registrations > Your app > Overview
AZURE_CLIENT_ID=your-application-client-id-here
# Azure Client Secret
# Get this from Azure Portal > App registrations > Your app > Certificates & secrets
AZURE_CLIENT_SECRET=your-client-secret-value-here
# Azure Tenant ID or 'common'
# Options:
# - 'common': Multi-tenant (any Azure AD organization)
# - 'organizations': Any Azure AD organization (excludes personal accounts)
# - 'consumers': Personal Microsoft accounts only
# - Your specific tenant ID: Single-tenant (e.g., '12345678-1234-1234-1234-123456789012')
# Get tenant ID from Azure Portal > App registrations > Your app > Overview
AZURE_TENANT_ID=common
# Notes:
# 1. These variables are used in supabase/config.toml via env() substitution
# 2. Never commit this file with real secrets to git
# 3. After setting these, restart your Supabase services: docker-compose restart
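As a sketch of what note 1 refers to, a provider block in `supabase/config.toml` could consume these variables via `env()` substitution. The section and key names below follow the Supabase CLI convention for external auth providers, but are illustrative rather than copied from this repo's actual config:

```toml
[auth.external.azure]
enabled = true
client_id = "env(AZURE_CLIENT_ID)"
secret = "env(AZURE_CLIENT_SECRET)"
# Tenant selection ('common', 'organizations', 'consumers', or a tenant ID)
# comes from AZURE_TENANT_ID; how it is wired in depends on your config.
```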


@@ -1,6 +1,6 @@
 # Web environment (Vite)
 VITE_SUPABASE_URL=http://exp-dash:54321
-VITE_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0
+VITE_SUPABASE_ANON_KEY=[REDACTED]
 # API config (optional)
 # MQTT_BROKER_HOST=mqtt

.envrc Normal file

@@ -0,0 +1,3 @@
# Automatically load the Nix development shell when entering this directory
# Requires direnv: https://direnv.net/
use flake

.gitignore vendored

@@ -4,7 +4,7 @@ __pycache__/
 *.egg-info/
 .venv/
 .uv/
-.env
+*.env
 .env.*.local
 .pytest_cache/
 .mypy_cache/
@@ -35,3 +35,53 @@ management-dashboard-web-app/users.txt
 # Jupyter Notebooks
 *.ipynb
+# Nix
+result
+result-*
+.direnv/
+# Python
+__pycache__/
+*.py[cod]
+*.egg-info/
+.venv/
+.uv/
+*.env
+.env.*.local
+.host-ip
+.pytest_cache/
+.mypy_cache/
+# Node / Vite
+node_modules/
+dist/
+.build/
+.vite/
+*.log
+# Editor/OS
+.DS_Store
+Thumbs.db
+.vscode/
+.idea/
+# API storage & logs
+camera-management-api/storage/
+camera-management-api/usda_vision_system.log
+# Docker
+*.pid
+camera-management-api/camera_sdk/
+camera-management-api/core
+management-dashboard-web-app/users.txt
+# Jupyter Notebooks
+*.ipynb
+supabase/.temp/cli-latest
+# Archive env backups (may contain secrets)
+archive/management-dashboard-web-app/env-backups/
+# Nix
+result
+result-*
+.direnv/

FLAKE_SETUP.md Normal file

@@ -0,0 +1,264 @@
# USDA Vision - Nix Flake Setup
This directory now has a Nix flake for building and developing the USDA Vision system, with ragenix for managing secrets.
## Quick Start
### Development Environment
Enter the development shell with all tools:
```bash
cd usda-vision
nix develop
```
This gives you:
- Docker & Docker Compose
- Node.js 20 with npm/pnpm
- Python 3.11 with pip/virtualenv
- Supabase CLI
- Camera SDK (automatically in `LD_LIBRARY_PATH`)
- ragenix for secrets management
- All standard utilities (jq, yq, rsync, etc.)
### Building
Build the package:
```bash
nix build
# Or explicitly:
nix build .#usda-vision
```
Build just the camera SDK:
```bash
nix build .#camera-sdk
```
## Secrets Management with ragenix
### Initial Setup
1. **Generate or use an age key**:
```bash
# Option 1: Generate a new age key
mkdir -p ~/.config/age
age-keygen -o ~/.config/age/keys.txt
# Option 2: Use your SSH key
ssh-to-age < ~/.ssh/id_ed25519.pub
# Copy the output to secrets/secrets.nix
```
2. **Add your public key** to [secrets/secrets.nix](secrets/secrets.nix):
```nix
{
publicKeys = [
"age1your_public_key_here"
# or
"ssh-ed25519 AAAA... user@host"
];
}
```
3. **Create encrypted environment files**:
```bash
nix develop # Enter dev shell first
ragenix -e secrets/env.age
```
This opens your `$EDITOR` to edit the encrypted file. Add your environment variables:
```bash
# Web environment (Vite)
VITE_SUPABASE_URL=http://exp-dash:54321
VITE_SUPABASE_ANON_KEY=your-anon-key-here
# ... etc
```
For Azure OAuth:
```bash
ragenix -e secrets/env.azure.age
```
### Using Secrets in Development
In the development shell, you can:
```bash
# Edit secrets
ragenix -e secrets/env.age
# View decrypted content (be careful on shared screens!)
age -d -i ~/.config/age/keys.txt secrets/env.age
# Re-encrypt all secrets after adding a new public key
ragenix -r
```
### Using Secrets in Production (NixOS)
The flake includes a NixOS module that handles secrets automatically:
```nix
# In your NixOS configuration
{
inputs.usda-vision.url = "path:/path/to/usda-vision";
# ... in your module:
imports = [ inputs.usda-vision.nixosModules.default ];
services.usda-vision = {
enable = true;
secretsFile = config.age.secrets.usda-vision-env.path;
};
# Configure ragenix/agenix to decrypt the secrets
age.secrets.usda-vision-env = {
file = inputs.usda-vision + "/secrets/env.age";
mode = "0644";
owner = "root";
};
}
```
## Project Structure
```
usda-vision/
├── flake.nix # Flake definition with outputs
├── package.nix # Main application build
├── camera-sdk.nix # Camera SDK build
├── secrets.nix # ragenix configuration
├── secrets/
│ ├── secrets.nix # Public keys
│ ├── env.age # Encrypted .env (safe to commit)
│ ├── env.azure.age # Encrypted Azure config (safe to commit)
│ └── README.md # Secrets documentation
└── ... (rest of the app)
```
## Migration from Old Setup
### Old Workflow
- Manual `.env` file management
- Secrets in plaintext (git-ignored)
- Build defined in parent `default.nix`
### New Workflow
- Encrypted `.age` files in git
- Secrets managed with ragenix
- Self-contained flake in `usda-vision/`
- Development shell with all tools
### Migration Steps
1. **Encrypt existing `.env` files**:
```bash
cd usda-vision
nix develop
# Setup your age key first (see above)
# Encrypt the main .env
ragenix -e secrets/env.age
# Paste contents of old .env file, save and exit
# Encrypt Azure config
ragenix -e secrets/env.azure.age
# Paste contents of old .env.azure file, save and exit
```
2. **Delete unencrypted files** (they're git-ignored but still local):
```bash
rm .env .env.azure management-dashboard-web-app/.env
```
3. **Commit encrypted secrets**:
```bash
git add secrets/env.age secrets/env.azure.age secrets/secrets.nix
git commit -m "Add encrypted secrets with ragenix"
```
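A quick sanity check after step 3 can confirm nothing was missed. This is a hypothetical helper, not part of the repo; the file paths are taken from the migration steps above:

```bash
#!/usr/bin/env sh
# Hypothetical post-migration check: plaintext env files gone,
# encrypted replacements present. Paths follow the steps above.
verify_migration() {
  for f in .env .env.azure management-dashboard-web-app/.env; do
    if [ -e "$f" ]; then
      echo "plaintext still present: $f"
      return 1
    fi
  done
  for f in secrets/env.age secrets/env.azure.age; do
    if [ ! -e "$f" ]; then
      echo "missing encrypted file: $f"
      return 1
    fi
  done
  echo "migration looks clean"
}
```

Run it from the `usda-vision` directory; any leftover plaintext file or missing `.age` file makes it fail with a message naming the offender.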
## Benefits
### Security
- ✅ Secrets encrypted at rest
- ✅ Safe to commit to git
- ✅ Key-based access control
- ✅ Audit trail (git history)
### Development
- ✅ Reproducible environment
- ✅ All tools included
- ✅ No manual setup
- ✅ Version-locked dependencies
### Deployment
- ✅ Declarative secrets management
- ✅ Automatic decryption on NixOS
- ✅ No manual key distribution
- ✅ Clean integration with existing infrastructure
## Common Tasks
### Add a new developer
1. They generate an age key or use their SSH key
2. They send you their public key
3. You add it to `secrets/secrets.nix`
4. Re-encrypt all secrets: `ragenix -r`
5. Commit and push
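The steps above could be scripted. A hypothetical helper (the function name is illustrative, and the `sed` edit assumes GNU sed plus the `publicKeys = [ ... ];` layout shown earlier in this guide) might look like:

```bash
# Hypothetical onboarding helper; not part of the repo.
# Assumes secrets/secrets.nix declares keys inside `publicKeys = [ ... ];`.
add_developer_key() {
  key="$1"
  # Insert the new public key right after the opening bracket of publicKeys
  # (GNU sed: \n in the replacement becomes a newline).
  sed -i "s|publicKeys = \[|publicKeys = [\n    \"${key}\"|" secrets/secrets.nix
  # Step 4: re-encrypt every .age file for the updated key set (needs ragenix).
  if command -v ragenix >/dev/null 2>&1; then
    ragenix -r
  fi
}
```

Usage would be `add_developer_key "age1..."` followed by the commit-and-push in step 5.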
### Rotate a secret
1. Edit the encrypted file: `ragenix -e secrets/env.age`
2. Update the value
3. Save and exit
4. Commit: `git commit secrets/env.age -m "Rotate API key"`
### Build without flakes
If you need to build on a system without flakes enabled:
```bash
nix-build -E '(import (fetchTarball https://github.com/edolstra/flake-compat/archive/master.tar.gz) { src = ./.; }).defaultNix.default'
```
## Troubleshooting
### "error: getting status of '...': No such file or directory"
Make sure you're in the `usda-vision` directory when running `nix develop` or `nix build`.
### "cannot decrypt: no valid identity"
Your age private key isn't found. Check:
- `~/.config/age/keys.txt` exists
- Your public key is in `secrets/secrets.nix`
- You've run `ragenix -r` after adding your key
### "experimental feature 'flakes' not enabled"
Add to `~/.config/nix/nix.conf`:
```
experimental-features = nix-command flakes
```
Or run with: `nix --experimental-features 'nix-command flakes' develop`
## Further Reading
- [Nix Flakes](https://nixos.wiki/wiki/Flakes)
- [ragenix](https://github.com/yaxitech/ragenix)
- [age encryption](https://github.com/FiloSottile/age)


@@ -7,6 +7,13 @@ A unified monorepo combining the camera API service and the web dashboard for US
 - `camera-management-api/` - Python API service for camera management (USDA-Vision-Cameras)
 - `management-dashboard-web-app/` - React web dashboard for experiment management (pecan_experiments)
 - `supabase/` - Database configuration, migrations, and seed data (shared infrastructure)
+- `media-api/` - Python service for video/thumbnail serving (port 8090)
+- `video-remote/` - Frontend for video browsing (port 3001)
+- `vision-system-remote/` - Camera/vision UI (port 3002)
+- `scheduling-remote/` - Scheduling/availability UI (port 3003)
+- `scripts/` - Host IP, RTSP checks, env helpers (see [scripts/README.md](scripts/README.md))
+- `docs/` - Setup, Supabase, RTSP, design docs
+- `mediamtx.yml` - RTSP/WebRTC config for MediaMTX streaming
 ## Quick Start
@@ -28,15 +35,15 @@ The wrapper script automatically:
 For more details, see [Docker Compose Environment Setup](docs/DOCKER_COMPOSE_ENV_SETUP.md).
-- Web: <http://localhost:5173>
+- Web: <http://localhost:8080>
 - API: <http://localhost:8000>
-- MQTT broker: localhost:1883
+- MQTT is optional; configure in API config if used (see `.env.example`).
 To stop: `docker compose down`
 ### Development Mode (Recommended for Development)
-For development with live logging, debugging, and hot reloading:
+For development, use the same Docker Compose stack as production. The web app runs with the Vite dev server on port 8080 (hot reload); the API runs on port 8000.
 1) Copy env template and set values (for web/Supabase):
@@ -45,36 +52,25 @@ cp .env.example .env
 # set VITE_SUPABASE_URL and VITE_SUPABASE_ANON_KEY in .env
 ```
-2) Start the development environment:
+2) Start the stack (with logs in the foreground, or add `-d` for detached):
 ```bash
-./dev-start.sh
+./docker-compose.sh up --build
 ```
-This will:
-- Start containers with debug logging enabled
-- Enable hot reloading for both API and web services
-- Show all logs in real-time
-- Keep containers running for debugging
 **Development URLs:**
-- Web: <http://localhost:8080> (with hot reloading)
-- API: <http://localhost:8000> (with debug logging)
+- Web: <http://localhost:8080> (Vite dev server with hot reload)
+- API: <http://localhost:8000>
 **Development Commands:**
-- `./dev-start.sh` - Start development environment
-- `./dev-stop.sh` - Stop development environment
-- `./dev-logs.sh` - View logs (use `-f` to follow, `-t N` for last N lines)
-- `./dev-logs.sh -f api` - Follow API logs only
-- `./dev-logs.sh -f web` - Follow web logs only
-- `./dev-shell.sh` - Open shell in API container
-- `./dev-shell.sh web` - Open shell in web container
-**Debug Features:**
-- API runs with `--debug --verbose` flags for maximum logging
-- Web runs with Vite dev server for hot reloading
-- All containers have `stdin_open: true` and `tty: true` for debugging
-- Environment variables set for development mode
+- `./docker-compose.sh up --build` - Start stack (omit `-d` to see logs)
+- `./docker-compose.sh up --build -d` - Start stack in background
+- `docker compose down` - Stop all services
+- `docker compose logs -f` - Follow all logs
+- `docker compose logs -f api` - Follow API logs only
+- `docker compose logs -f web` - Follow web logs only
+- `docker compose exec api sh` - Open shell in API container
+- `docker compose exec web sh` - Open shell in web container
 ## Services
@@ -84,16 +80,34 @@ This will:
 - Video recording controls
 - File management
-### Web Dashboard (Port 5173)
+### Web Dashboard (Port 8080)
 - User authentication via Supabase
 - Experiment definition and management
 - Camera control interface
 - Video playback and analysis
-### MQTT Broker (Port 1883)
+### Media API (Port 8090)
-- Local Mosquitto broker for development and integration testing
+- Video listing, thumbnails, transcoding
+### Video Remote (Port 3001)
+- Video browser UI
+### Vision System Remote (Port 3002)
+- Camera/vision control UI
+### Scheduling Remote (Port 3003)
+- Scheduling/availability UI
+### MediaMTX (Ports 8554, 8889, 8189)
+- RTSP and WebRTC streaming (config: [mediamtx.yml](mediamtx.yml))
+Supabase services are currently commented out in `docker-compose.yml` and can be run via Supabase CLI (e.g. from `management-dashboard-web-app`). See [docs](docs/) for setup.
 ## Git Subtree Workflow
@@ -138,7 +152,7 @@ Notes:
 - Storage (recordings) is mapped to `camera-management-api/storage/` and ignored by git.
 - Web
-- Code lives under `management-dashboard-web-app/` with a Vite dev server on port 5173.
+- Code lives under `management-dashboard-web-app/` with a Vite dev server on port 8080 when run via Docker.
 - Environment: set `VITE_SUPABASE_URL` and `VITE_SUPABASE_ANON_KEY` in `.env` (not committed).
 - Common scripts: `npm run dev`, `npm run build` (executed inside the container by compose).

SETUP_COMPLETE.md

@@ -0,0 +1,241 @@
# USDA Vision - Flake Migration Complete ✅
## Summary
Your USDA Vision repository now has:
1. **Self-contained Nix flake** (`flake.nix`)
- Independent build system
- Development environment
- NixOS module for deployment
2. **Encrypted secrets management** (ragenix)
- `.age` files safe to commit to git
- Key-based access control
- No more plaintext `.env` files
3. **Modular build** (package.nix, camera-sdk.nix)
- Cleaner organization
- Easier to maintain
- Reusable components
4. **Updated parent** (../default.nix)
- Now references the flake
- Removed 200+ lines of inline derivations
## Files Added
### Core Flake Files
- `flake.nix` - Main flake definition with outputs
- `package.nix` - Application build logic
- `camera-sdk.nix` - Camera SDK build logic
- `secrets.nix` - ragenix configuration
### Secrets Infrastructure
- `secrets/secrets.nix` - Public key list
- `secrets/README.md` - Secrets documentation
- `secrets/.gitignore` - Protect plaintext files
### Documentation & Helpers
- `FLAKE_SETUP.md` - Complete setup guide
- `setup-dev.sh` - Interactive setup script
- `.envrc` - direnv integration (optional)
### Parent Directory
- `NIX_FLAKE_MIGRATION.md` - Migration summary
## Next Steps
### 1. Commit the Flake Files
The flake needs to be in git to work:
```bash
cd /home/engr-ugaif/usda-dash-config/usda-vision
# Add all new flake files
git add flake.nix package.nix camera-sdk.nix secrets.nix
git add secrets/secrets.nix secrets/README.md secrets/.gitignore
git add FLAKE_SETUP.md setup-dev.sh .envrc .gitignore
# Commit
git commit -m "Add Nix flake with ragenix secrets management
- Self-contained flake build system
- Development shell with all tools
- ragenix for encrypted secrets
- Modular package definitions
"
```
### 2. Set Up Your Age Key
```bash
cd /home/engr-ugaif/usda-dash-config/usda-vision
# Option A: Use the interactive setup script
./setup-dev.sh
# Option B: Manual setup
mkdir -p ~/.config/age
age-keygen -o ~/.config/age/keys.txt
# Then add your public key to secrets/secrets.nix
```
### 3. Encrypt Your Secrets
```bash
# Enter the development environment
nix develop
# Encrypt main .env file
ragenix -e secrets/env.age
# Paste your current .env contents, save, exit
# Encrypt Azure config
ragenix -e secrets/env.azure.age
# Paste your current .env.azure contents, save, exit
# Commit encrypted secrets
git add secrets/env.age secrets/env.azure.age
git commit -m "Add encrypted environment configuration"
```
### 4. Test the Setup
```bash
# Test that the build works
nix build
# Test the development shell
nix develop
# You should see a welcome message
# Inside the dev shell, verify tools
docker-compose --version
supabase --version
ragenix --help
```
### 5. Update the Parent Repository
```bash
cd /home/engr-ugaif/usda-dash-config
# Commit the updated default.nix
git add default.nix NIX_FLAKE_MIGRATION.md
git commit -m "Update default.nix to use usda-vision flake
- Removed inline derivations
- Now references usda-vision flake packages
- Cleaner, more maintainable code
"
```
### 6. Clean Up Old Files (Optional)
After verifying everything works, you can delete the old plaintext secrets:
```bash
cd /home/engr-ugaif/usda-dash-config/usda-vision
# These are already git-ignored, but remove them locally
rm -f .env .env.azure management-dashboard-web-app/.env
echo "✅ Old plaintext secrets removed"
```
## Verification Checklist
- [ ] Flake files committed to git
- [ ] Age key generated at `~/.config/age/keys.txt`
- [ ] Public key added to `secrets/secrets.nix`
- [ ] Secrets encrypted and committed
- [ ] `nix build` succeeds
- [ ] `nix develop` works
- [ ] Parent `default.nix` updated and committed
- [ ] Old `.env` files deleted
## Usage Quick Reference
### Development
```bash
# Enter dev environment (one-time per session)
cd usda-vision
nix develop
# Edit secrets
ragenix -e secrets/env.age
# Normal docker-compose workflow
docker-compose up -d
docker-compose logs -f
```
### Building
```bash
# Build everything
nix build
# Build specific packages
nix build .#usda-vision
nix build .#camera-sdk
```
### Secrets Management
```bash
# Edit encrypted secret
ragenix -e secrets/env.age
# Re-key after adding a new public key
ragenix -r
# View decrypted (careful!)
age -d -i ~/.config/age/keys.txt secrets/env.age
```
## Troubleshooting
### "cannot decrypt: no valid identity"
Your age key isn't configured. Run:
```bash
./setup-dev.sh
```
### "error: flake.nix is not in git"
Commit the flake files:
```bash
git add flake.nix package.nix camera-sdk.nix secrets.nix
git commit -m "Add flake files"
```
### "experimental feature 'flakes' not enabled"
Add to `~/.config/nix/nix.conf`:
```
experimental-features = nix-command flakes
```
## Documentation
- **Full Setup Guide**: [FLAKE_SETUP.md](FLAKE_SETUP.md)
- **Secrets Guide**: [secrets/README.md](secrets/README.md)
- **Migration Summary**: [../NIX_FLAKE_MIGRATION.md](../NIX_FLAKE_MIGRATION.md)
## Questions?
Refer to [FLAKE_SETUP.md](FLAKE_SETUP.md) for detailed documentation, or run:
```bash
./setup-dev.sh # Interactive setup
```
---
**Migration completed on**: 2026-01-30
**Created by**: GitHub Copilot


@@ -0,0 +1,7 @@
# Archive: management-dashboard-web-app legacy/backup files
Moved from `management-dashboard-web-app/` so the app directory only contains active code and config.
- **env-backups/** Old `.env.backup` and timestamped backup (Supabase URL/key). Keep out of version control if they contain secrets.
- **experiment-data/** CSV run sheets: `phase_2_JC_experimental_run_sheet.csv`, `post_workshop_meyer_experiments.csv`. Source/reference data for experiments.
- **test-api-fix.js** One-off test script for camera config API; not part of the app build.


@@ -0,0 +1,61 @@
experiment_number,soaking_duration_hr,air_drying_duration_min,plate_contact_frequency_hz,throughput_rate_pecans_sec,crush_amount_in,entry_exit_height_diff_in
0,34,19,53,28,0.05,-0.09
1,24,27,34,29,0.03,0.01
12,28,59,37,23,0.06,-0.08
15,16,60,30,24,0.07,0.02
4,13,41,41,38,0.05,0.03
18,18,49,38,35,0.07,-0.08
11,24,59,42,25,0.07,-0.05
16,20,59,41,14,0.07,0.04
4,13,41,41,38,0.05,0.03
19,11,25,56,34,0.06,-0.09
15,16,60,30,24,0.07,0.02
16,20,59,41,14,0.07,0.04
10,26,60,44,12,0.08,-0.1
1,24,27,34,29,0.03,0.01
17,34,60,34,29,0.07,-0.09
5,30,33,30,36,0.05,-0.04
2,38,10,60,28,0.06,-0.1
2,38,10,60,28,0.06,-0.1
13,21,59,41,21,0.06,-0.09
1,24,27,34,29,0.03,0.01
14,22,59,45,17,0.07,-0.08
6,10,22,37,30,0.06,0.02
11,24,59,42,25,0.07,-0.05
19,11,25,56,34,0.06,-0.09
8,27,12,55,24,0.04,0.04
18,18,49,38,35,0.07,-0.08
5,30,33,30,36,0.05,-0.04
9,32,26,47,26,0.07,0.03
3,11,36,42,13,0.07,-0.07
10,26,60,44,12,0.08,-0.1
8,27,12,55,24,0.04,0.04
5,30,33,30,36,0.05,-0.04
8,27,12,55,24,0.04,0.04
18,18,49,38,35,0.07,-0.08
3,11,36,42,13,0.07,-0.07
10,26,60,44,12,0.08,-0.1
17,34,60,34,29,0.07,-0.09
13,21,59,41,21,0.06,-0.09
12,28,59,37,23,0.06,-0.08
9,32,26,47,26,0.07,0.03
14,22,59,45,17,0.07,-0.08
0,34,19,53,28,0.05,-0.09
7,15,30,35,32,0.05,-0.07
0,34,19,53,28,0.05,-0.09
15,16,60,30,24,0.07,0.02
13,21,59,41,21,0.06,-0.09
11,24,59,42,25,0.07,-0.05
7,15,30,35,32,0.05,-0.07
16,20,59,41,14,0.07,0.04
3,11,36,42,13,0.07,-0.07
7,15,30,35,32,0.05,-0.07
6,10,22,37,30,0.06,0.02
19,11,25,56,34,0.06,-0.09
6,10,22,37,30,0.06,0.02
2,38,10,60,28,0.06,-0.1
14,22,59,45,17,0.07,-0.08
4,13,41,41,38,0.05,0.03
9,32,26,47,26,0.07,0.03
17,34,60,34,29,0.07,-0.09
12,28,59,37,23,0.06,-0.08

@@ -0,0 +1,41 @@
phase_name,machine_type,Motor Speed (Hz),soaking_duration_hr,air_drying_duration_min,jig Displacement (in),Spring Stiffness (N/m),reps_required,rep
"Post Workshop Meyer Experiments","Meyer Cracker",33,27,28,-0.307,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",30,37,17,-0.311,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",47,36,50,-0.291,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",42,12,30,-0.314,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",53,34,19,-0.302,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",37,18,40,-0.301,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",40,14,59,-0.286,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",39,18,32,-0.309,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",49,11,31,-0.299,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",47,33,12,-0.295,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",52,23,36,-0.302,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",59,37,35,-0.299,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",41,15,15,-0.312,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",46,24,22,-0.303,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",50,36,15,-0.308,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",36,32,48,-0.306,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",33,28,38,-0.308,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",35,31,51,-0.311,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",55,20,57,-0.304,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",44,10,27,-0.313,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",37,16,43,-0.294,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",56,25,42,-0.31,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",30,13,21,-0.292,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",60,29,46,-0.294,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",41,21,54,-0.306,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",55,29,54,-0.296,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",39,30,48,-0.293,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",34,35,53,-0.285,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",57,32,39,-0.291,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",45,27,38,-0.296,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",52,17,25,-0.297,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",51,13,22,-0.288,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",36,19,11,-0.29,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",44,38,32,-0.315,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",58,26,18,-0.289,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",32,22,52,-0.288,1800,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",43,12,56,-0.287,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",60,16,45,-0.298,2200,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",54,22,25,-0.301,2000,1,1
"Post Workshop Meyer Experiments","Meyer Cracker",48,24,13,-0.305,2000,1,1

@@ -0,0 +1,132 @@
// Test script to verify the camera configuration API fix
// This simulates the VisionApiClient.getCameraConfig method
class TestVisionApiClient {
constructor() {
this.baseUrl = 'http://localhost:8000'
}
async request(endpoint) {
const response = await fetch(`${this.baseUrl}${endpoint}`, {
headers: {
'Content-Type': 'application/json',
}
})
if (!response.ok) {
const errorText = await response.text()
throw new Error(`HTTP ${response.status}: ${response.statusText}\n${errorText}`)
}
return response.json()
}
// This is our fixed getCameraConfig method
async getCameraConfig(cameraName) {
try {
const config = await this.request(`/cameras/${cameraName}/config`)
// Ensure auto-recording fields have default values if missing
return {
...config,
auto_start_recording_enabled: config.auto_start_recording_enabled ?? false,
auto_recording_max_retries: config.auto_recording_max_retries ?? 3,
auto_recording_retry_delay_seconds: config.auto_recording_retry_delay_seconds ?? 5
}
} catch (error) {
// If the error is related to missing auto-recording fields, try to handle it gracefully
if (error.message?.includes('auto_start_recording_enabled') ||
error.message?.includes('auto_recording_max_retries') ||
error.message?.includes('auto_recording_retry_delay_seconds')) {
// Try to get the raw camera data and add default auto-recording fields
try {
const response = await fetch(`${this.baseUrl}/cameras/${cameraName}/config`, {
headers: {
'Content-Type': 'application/json',
}
})
if (!response.ok) {
throw new Error(`HTTP ${response.status}: ${response.statusText}`)
}
const rawConfig = await response.json()
// Add missing auto-recording fields with defaults
return {
...rawConfig,
auto_start_recording_enabled: false,
auto_recording_max_retries: 3,
auto_recording_retry_delay_seconds: 5
}
} catch (fallbackError) {
throw new Error(`Failed to load camera configuration: ${error.message}`)
}
}
throw error
}
}
async getCameras() {
return this.request('/cameras')
}
}
// Test function
async function testCameraConfigFix() {
console.log('🧪 Testing Camera Configuration API Fix')
console.log('='.repeat(50))
const api = new TestVisionApiClient()
try {
// First get available cameras
console.log('📋 Getting camera list...')
const cameras = await api.getCameras()
const cameraNames = Object.keys(cameras)
if (cameraNames.length === 0) {
console.log('❌ No cameras found')
return
}
console.log(`✅ Found ${cameraNames.length} cameras: ${cameraNames.join(', ')}`)
// Test configuration for each camera
for (const cameraName of cameraNames) {
console.log(`\n🎥 Testing configuration for ${cameraName}...`)
try {
const config = await api.getCameraConfig(cameraName)
console.log(`✅ Configuration loaded successfully for ${cameraName}`)
console.log(` - auto_start_recording_enabled: ${config.auto_start_recording_enabled}`)
console.log(` - auto_recording_max_retries: ${config.auto_recording_max_retries}`)
console.log(` - auto_recording_retry_delay_seconds: ${config.auto_recording_retry_delay_seconds}`)
console.log(` - exposure_ms: ${config.exposure_ms}`)
console.log(` - gain: ${config.gain}`)
} catch (error) {
console.log(`❌ Configuration failed for ${cameraName}: ${error.message}`)
}
}
console.log('\n🎉 Camera configuration API test completed!')
} catch (error) {
console.log(`❌ Test failed: ${error.message}`)
}
}
// Export for use in browser console or Node.js
if (typeof module !== 'undefined' && module.exports) {
module.exports = { TestVisionApiClient, testCameraConfigFix }
} else {
// Browser environment
window.TestVisionApiClient = TestVisionApiClient
window.testCameraConfigFix = testCameraConfigFix
}
console.log('📝 Test script loaded. Run testCameraConfigFix() to test the fix.')

camera-sdk.nix

@@ -0,0 +1,44 @@
{ stdenv
, lib
, makeWrapper
, libusb1
}:
stdenv.mkDerivation {
pname = "mindvision-camera-sdk";
version = "2.1.0.49";
# Use the camera_sdk directory as source
src = ./camera-management-api/camera_sdk;
nativeBuildInputs = [ makeWrapper ];
buildInputs = [ libusb1 ];
unpackPhase = ''
cp -r $src/* .
tar xzf "linuxSDK_V2.1.0.49(250108).tar.gz"
cd "linuxSDK_V2.1.0.49(250108)"
'';
installPhase = ''
mkdir -p $out/lib $out/include
# Copy x64 library files (SDK has arch-specific subdirs)
if [ -d lib/x64 ]; then
cp -r lib/x64/* $out/lib/ || true
fi
# Copy header files
if [ -d include ]; then
cp -r include/* $out/include/ || true
fi
# Make libraries executable
chmod +x $out/lib/*.so* 2>/dev/null || true
'';
meta = with lib; {
description = "MindVision Camera SDK";
platforms = platforms.linux;
};
}


@@ -2,10 +2,10 @@ networks:
usda-vision-network:
driver: bridge
# volumes:
# supabase-db:
# driver: local
# supabase-storage:
services:
# ============================================================================
@@ -17,7 +17,7 @@ services:
# - Filter by label: docker compose ps --filter "label=com.usda-vision.service=supabase"
# - Or use service names: docker compose ps supabase-*
#
# NOTE: Supabase CLI and docker-compose use root supabase/
# # # Supabase Database
# # supabase-db:
@@ -400,6 +400,8 @@ services:
video-remote:
container_name: usda-vision-video-remote
image: node:20-alpine
tty: true
stdin_open: true
working_dir: /app
environment:
- CHOKIDAR_USEPOLLING=true
@@ -424,6 +426,8 @@ services:
vision-system-remote:
container_name: usda-vision-vision-system-remote
image: node:20-alpine
tty: true
stdin_open: true
working_dir: /app
environment:
- CHOKIDAR_USEPOLLING=true
@@ -447,6 +451,8 @@ services:
scheduling-remote:
container_name: usda-vision-scheduling-remote
image: node:20-alpine
tty: true
stdin_open: true
working_dir: /app
env_file:
- ./management-dashboard-web-app/.env
@@ -466,6 +472,14 @@ services:
- "3003:3003"
networks:
- usda-vision-network
develop:
watch:
- path: ./scheduling-remote
action: restart
ignore:
- node_modules/
- dist/
- .git/
media-api:
container_name: usda-vision-media-api


@@ -7,6 +7,7 @@ Your current design has a **fundamental flaw** that prevents it from working cor
### The Problem
The phase tables (`soaking`, `airdrying`, `cracking`, `shelling`) have this constraint:
```sql
CONSTRAINT unique_soaking_per_experiment UNIQUE (experiment_id)
```
@@ -16,7 +17,8 @@ This means you can **only have ONE soaking record per experiment**, even if you
### Why This Happens
When you create an experiment with 3 repetitions:
1. ✅ 3 rows are created in `experiment_repetitions`
2. ❌ But you can only create 1 row in `soaking` (due to UNIQUE constraint)
3. ❌ The other 2 repetitions cannot have soaking data!
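Concretely, the second repetition's insert fails against that constraint. A hypothetical illustration (only `experiment_id` and the constraint name are taken from the actual schema; `repetition_id` is assumed):

```sql
INSERT INTO soaking (experiment_id, repetition_id) VALUES (42, 1);  -- rep 1: OK
INSERT INTO soaking (experiment_id, repetition_id) VALUES (42, 2);
-- ERROR: duplicate key value violates unique constraint "unique_soaking_per_experiment"
```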
@@ -25,6 +27,7 @@ When you create an experiment with 3 repetitions:
### Current Approach: Separate Tables (❌ Not Recommended)
**Problems:**
- ❌ UNIQUE constraint breaks repetitions
- ❌ Schema duplication (same structure 4 times)
- ❌ Hard to query "all phases for a repetition"
@@ -34,6 +37,7 @@ When you create an experiment with 3 repetitions:
### Recommended Approach: Unified Table (✅ Best Practice)
**Benefits:**
- ✅ Properly supports repetitions (one phase per repetition)
- ✅ Automatic phase creation via database trigger
- ✅ Simple sequential time calculations
@@ -45,7 +49,7 @@ When you create an experiment with 3 repetitions:
I've created a migration file that implements a **unified `experiment_phase_executions` table**:
### Key Features
1. **Single Table for All Phases**
- Uses `phase_type` enum to distinguish phases
@@ -68,7 +72,8 @@ I've created a migration file that implements a **unified `experiment_phase_exec
## Files Created
1. **`docs/database_design_analysis.md`** - Detailed analysis with comparison matrix
2. **`management-dashboard-web-app/supabase/migrations/00012_unified_phase_executions.sql`** - Complete migration implementation
3. **`supabase/migrations/00012_unified_phase_executions.sql`** - Complete migration implementation
## Migration Path
@@ -81,6 +86,7 @@ I've created a migration file that implements a **unified `experiment_phase_exec
## Alternative: Fix Current Design
If you prefer to keep separate tables, you MUST:
1. Remove `UNIQUE (experiment_id)` constraints from all phase tables
2. Keep only `UNIQUE (repetition_id)` constraints
3. Add trigger to auto-create phase entries when repetitions are created
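A minimal sketch of steps 1 and 2 for the `soaking` table (the `repetition_id` column name is an assumption; the same change would be repeated for the other three phase tables):

```sql
-- Drop the per-experiment constraint (name taken from the current schema)
ALTER TABLE soaking DROP CONSTRAINT unique_soaking_per_experiment;
-- Allow one soaking row per repetition instead
ALTER TABLE soaking ADD CONSTRAINT unique_soaking_per_repetition UNIQUE (repetition_id);
```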
@@ -91,4 +97,3 @@ However, this still has the drawbacks of schema duplication and complexity.
## Recommendation
**Use the unified table approach** - it's cleaner, more maintainable, and properly supports your repetition model.


@@ -0,0 +1,138 @@
# Microsoft Entra OpenID Connect - Quick Start
## What Was Implemented
The Login flow now supports Microsoft Entra ID (Azure AD) authentication via OpenID Connect using Supabase's Azure OAuth provider.
## Files Modified
1. **[Login.tsx](management-dashboard-web-app/src/components/Login.tsx)**: Added Microsoft login button and OAuth flow
2. **[MicrosoftIcon.tsx](management-dashboard-web-app/src/components/MicrosoftIcon.tsx)**: Created Microsoft logo component
3. **[vite-env.d.ts](management-dashboard-web-app/src/vite-env.d.ts)**: Added TypeScript type for new environment variable
4. **[.env.example](management-dashboard-web-app/.env.example)**: Added Microsoft login configuration
## How It Works
### User Experience
1. User visits login page and sees two options:
- Traditional email/password login (existing)
- "Sign in with Microsoft" button (new)
2. Clicking Microsoft button redirects to Microsoft login
3. User authenticates with Microsoft credentials
4. Microsoft redirects back to app with authenticated session
### Technical Flow
```
User clicks "Sign in with Microsoft"
        ↓
Supabase signInWithOAuth() with provider: 'azure'
        ↓
Redirect to Microsoft Entra login page
        ↓
User authenticates with Microsoft
        ↓
Microsoft redirects to Supabase callback URL
        ↓
Supabase validates and creates session
        ↓
User redirected back to application (authenticated)
```
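The first hop of that flow is a single supabase-js call. A minimal sketch of the parameters involved — the `provider: 'azure'` value matches supabase-js v2, but the exact scopes and redirect target used by Login.tsx are assumptions here:

```typescript
// Shape of the argument to supabase.auth.signInWithOAuth() in supabase-js v2
type OAuthParams = {
  provider: "azure";
  options: { scopes: string; redirectTo: string };
};

function azureOAuthParams(origin: string): OAuthParams {
  return {
    provider: "azure", // Supabase's name for the Microsoft Entra provider
    options: {
      scopes: "openid profile email", // claims requested from Entra
      redirectTo: origin,             // where Supabase sends the user afterwards
    },
  };
}
```

In the app, this object would be passed to `supabase.auth.signInWithOAuth(...)`, which performs the redirect to Microsoft.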
## Configuration Required
### 1. Azure Portal Setup
- Register application in Microsoft Entra ID
- Configure redirect URI:
- **Supabase Cloud**: `https://<supabase-ref>.supabase.co/auth/v1/callback`
- **Self-hosted**: `http://<your-host>:<port>/auth/v1/callback`
- Generate client ID and client secret
- Set API permissions (openid, profile, email)
### 2. Supabase Configuration
#### For Supabase Cloud:
Navigate to Authentication > Providers > Azure and configure:
- **Azure Client ID**: From Azure app registration
- **Azure Secret**: From Azure client secrets
- **Azure Tenant**: Use `common` for multi-tenant or specific tenant ID
#### For Self-Hosted Supabase:
Edit `supabase/config.toml`:
```toml
[auth.external.azure]
enabled = true
client_id = "env(AZURE_CLIENT_ID)"
secret = "env(AZURE_CLIENT_SECRET)"
redirect_uri = ""
url = "https://login.microsoftonline.com/env(AZURE_TENANT_ID)/v2.0"
skip_nonce_check = false
```
Set environment variables:
```bash
AZURE_CLIENT_ID="your-application-client-id"
AZURE_CLIENT_SECRET="your-client-secret"
AZURE_TENANT_ID="common" # or specific tenant ID
```
Restart Supabase:
```bash
docker-compose down && docker-compose up -d
```
### 3. Application Environment
Set in `.env` file:
```bash
VITE_ENABLE_MICROSOFT_LOGIN=true
```
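Because Vite exposes environment variables as strings on `import.meta.env`, the app presumably normalizes the raw value before showing the button. A sketch of that check (the helper name is hypothetical):

```typescript
// Hypothetical helper; in the app the raw value would be
// import.meta.env.VITE_ENABLE_MICROSOFT_LOGIN
function microsoftLoginEnabled(raw: string | undefined): boolean {
  // Vite env values are always strings; anything but the literal "true"
  // (including an unset variable) leaves the button hidden
  return raw === "true";
}

console.log(microsoftLoginEnabled("true"));    // true
console.log(microsoftLoginEnabled(undefined)); // false
```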
## Testing
### Enable Microsoft Login
```bash
# In .env file
VITE_ENABLE_MICROSOFT_LOGIN=true
```
### Disable Microsoft Login
```bash
# In .env file
VITE_ENABLE_MICROSOFT_LOGIN=false
# Or simply remove the variable
```
## Features
- **Hybrid Authentication**: Supports both email/password and Microsoft login
- **Feature Flag**: Microsoft login can be enabled/disabled via environment variable
- **Dark Mode Support**: Microsoft button matches the existing theme
- **Error Handling**: Displays authentication errors to users
- **Loading States**: Shows loading indicator during authentication
- **Type Safety**: Full TypeScript support with proper types
## Next Steps
1. **Complete Azure Setup**: Follow [MICROSOFT_ENTRA_SETUP.md](./MICROSOFT_ENTRA_SETUP.md) for detailed configuration
2. **Configure Supabase**: Enable Azure provider in Supabase dashboard
3. **Test Authentication**: Verify the complete login flow
4. **Deploy**: Update production environment variables
## Documentation
- **Full Setup Guide**: [MICROSOFT_ENTRA_SETUP.md](./MICROSOFT_ENTRA_SETUP.md) - Complete Azure and Supabase configuration
- **Supabase Docs**: https://supabase.com/docs/guides/auth/social-login/auth-azure
- **Microsoft Identity Platform**: https://docs.microsoft.com/azure/active-directory/develop/
## Support
For issues or questions:
- Check [MICROSOFT_ENTRA_SETUP.md](./MICROSOFT_ENTRA_SETUP.md) troubleshooting section
- Review Supabase Auth logs
- Check Azure sign-in logs in Azure Portal
- Verify redirect URIs match exactly
---
**Implementation Date**: January 9, 2026
**Status**: Ready for configuration and testing


@@ -0,0 +1,425 @@
# Microsoft Entra (Azure AD) OpenID Connect Setup Guide
## Overview
This guide walks you through configuring Microsoft Entra ID (formerly Azure Active Directory) authentication for the USDA Vision Management Dashboard using Supabase's Azure OAuth provider.
> **📌 Self-Hosted Supabase Users**: If you're using a self-hosted Supabase instance, see the simplified guide: [SELF_HOSTED_AZURE_SETUP.md](SELF_HOSTED_AZURE_SETUP.md). Self-hosted instances configure OAuth providers via `config.toml` and environment variables, not through the UI.
## Prerequisites
- Access to Azure Portal (https://portal.azure.com)
- Admin permissions to register applications in Azure AD
- Access to your Supabase project (Cloud dashboard or self-hosted instance)
- The USDA Vision application deployed and accessible via URL
## Step 1: Register Application in Microsoft Entra ID
### 1.1 Navigate to Azure Portal
1. Log in to [Azure Portal](https://portal.azure.com)
2. Navigate to **Microsoft Entra ID** (or **Azure Active Directory**)
3. Select **App registrations** from the left sidebar
4. Click **+ New registration**
### 1.2 Configure Application Registration
Fill in the following details:
- **Name**: `USDA Vision Management Dashboard` (or your preferred name)
- **Supported account types**: Choose one of:
- **Single tenant**: Only users in your organization can sign in (most restrictive)
- **Multitenant**: Users in any Azure AD tenant can sign in
- **Multitenant + personal Microsoft accounts**: Broadest support
- **Redirect URI**:
- Platform: **Web**
- URI: `https://<your-supabase-project-ref>.supabase.co/auth/v1/callback`
- Example: `https://abcdefghij.supabase.co/auth/v1/callback`
Click **Register** to create the application.
### 1.3 Note Application (Client) ID
After registration, you'll be taken to the app overview page. Copy and save:
- **Application (client) ID**: This is your `AZURE_CLIENT_ID`
- **Directory (tenant) ID**: This is your `AZURE_TENANT_ID`
## Step 2: Configure Client Secret
### 2.1 Create a Client Secret
1. In your app registration, navigate to **Certificates & secrets**
2. Click **+ New client secret**
3. Add a description: `Supabase Auth`
4. Choose an expiration period (recommendation: 12-24 months)
5. Click **Add**
### 2.2 Save the Secret Value
**IMPORTANT**: Copy the **Value** immediately - it will only be shown once!
- This is your `AZURE_CLIENT_SECRET`
- Store it securely (password manager, secure vault, etc.)
## Step 3: Configure API Permissions
### 3.1 Add Required Permissions
1. Navigate to **API permissions** in your app registration
2. Click **+ Add a permission**
3. Select **Microsoft Graph**
4. Choose **Delegated permissions**
5. Add the following permissions:
- `openid` (Sign users in)
- `profile` (View users' basic profile)
- `email` (View users' email address)
- `User.Read` (Sign in and read user profile)
6. Click **Add permissions**
### 3.2 Grant Admin Consent (Optional but Recommended)
If you have admin privileges:
1. Click **Grant admin consent for [Your Organization]**
2. Confirm the action
This prevents users from seeing a consent prompt on first login.
## Step 4: Configure Authentication Settings
### 4.1 Set Token Configuration
1. Navigate to **Token configuration** in your app registration
2. Click **+ Add optional claim**
3. Choose **ID** token type
4. Add the following claims:
- `email`
- `family_name`
- `given_name`
- `upn` (User Principal Name)
5. Check **Turn on the Microsoft Graph email, profile permission** if prompted
### 4.2 Configure Authentication Flow
1. Navigate to **Authentication** in your app registration
2. Under **Implicit grant and hybrid flows**, ensure:
- **ID tokens** (used for implicit and hybrid flows) is checked
3. Under **Allow public client flows**:
- Select **No** (keep it secure)
## Step 5: Configure Supabase
### For Supabase Cloud (Hosted)
#### 5.1 Navigate to Supabase Auth Settings
1. Log in to your [Supabase Dashboard](https://app.supabase.com)
2. Select your project
3. Navigate to **Authentication** > **Providers**
4. Find **Azure** in the provider list
#### 5.2 Enable and Configure Azure Provider
1. Toggle **Enable Sign in with Azure** to ON
2. Fill in the configuration:
- **Azure Client ID**: Paste your Application (client) ID from Step 1.3
- **Azure Secret**: Paste your client secret value from Step 2.2
- **Azure Tenant**: You have two options:
- Use `common` for multi-tenant applications
- Use your specific **Directory (tenant) ID** for single-tenant
- Format: Just the GUID (e.g., `12345678-1234-1234-1234-123456789012`)
3. Click **Save**
#### 5.3 Note the Callback URL
Supabase provides the callback URL in the format:
```
https://<your-project-ref>.supabase.co/auth/v1/callback
```
Verify this matches what you configured in Azure (Step 1.2).
### For Self-Hosted Supabase
If you're running a self-hosted Supabase instance, OAuth providers are configured via the `config.toml` file and environment variables rather than through the UI.
#### 5.1 Edit config.toml
1. Open your `supabase/config.toml` file
2. Find or add the `[auth.external.azure]` section:
```toml
[auth.external.azure]
enabled = true
client_id = "env(AZURE_CLIENT_ID)"
secret = "env(AZURE_CLIENT_SECRET)"
redirect_uri = ""
url = "https://login.microsoftonline.com/env(AZURE_TENANT_ID)/v2.0"
skip_nonce_check = false
```
3. Set `enabled = true` to activate Azure authentication
#### 5.2 Set Environment Variables
Create or update your environment file (`.env` or set in your deployment):
```bash
# Azure AD OAuth Configuration
AZURE_CLIENT_ID="your-application-client-id-from-azure"
AZURE_CLIENT_SECRET="your-client-secret-from-azure"
AZURE_TENANT_ID="common" # or your specific tenant ID
```
**Important**:
- Use `common` for any Azure AD organization or personal Microsoft account (multi-tenant)
- Use `organizations` for any Azure AD organization (excludes personal Microsoft accounts)
- Use `consumers` for personal Microsoft accounts only
- Use your specific tenant ID (GUID) for single-tenant applications
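These accepted values can be sanity-checked before deployment. The helper below is a minimal sketch (not part of the codebase) that accepts the three tenant keywords or a tenant GUID:

```typescript
// Hypothetical validator for AZURE_TENANT_ID values, matching the
// accepted forms listed above. Not part of the actual project code.
const GUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function isValidAzureTenant(value: string): boolean {
  // Special keywords understood by the Microsoft identity platform.
  if (["common", "organizations", "consumers"].includes(value)) return true;
  // Otherwise it must be a specific tenant GUID.
  return GUID_RE.test(value);
}
```

Running such a check at startup catches a misconfigured tenant before the first failed login rather than after.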
#### 5.3 Update Azure Redirect URI
For self-hosted Supabase, your callback URL will be:
```
http://<your-host>:<supabase-port>/auth/v1/callback
```
For example, if your Supabase API is at `http://192.168.1.100:54321`:
```
http://192.168.1.100:54321/auth/v1/callback
```
**Go back to Azure Portal** (Step 1.2) and add this redirect URI to your app registration.
#### 5.4 Restart Supabase Services
After making these changes, restart your Supabase services:
```bash
# If using docker-compose
docker-compose down
docker-compose up -d
# Or if using the provided script
./docker-compose.sh restart
```
#### 5.5 Verify Configuration
Check that the auth service picked up your configuration:
```bash
# View auth service logs
docker-compose logs auth
# Or for specific service name
docker-compose logs supabase-auth
```
Look for log entries indicating Azure provider is enabled.
## Step 6: Configure Application Environment
### 6.1 Update Environment Variables
In your application's `.env` file, add or update:
```bash
# Supabase Configuration (if not already present)
VITE_SUPABASE_URL=https://<your-project-ref>.supabase.co
VITE_SUPABASE_ANON_KEY=<your-anon-key>
# Enable Microsoft Login
VITE_ENABLE_MICROSOFT_LOGIN=true
```
### 6.2 Restart Development Server
If running locally:
```bash
npm run dev
```
## Step 7: Test the Integration
### 7.1 Test Login Flow
1. Navigate to your application's login page
2. You should see a "Sign in with Microsoft" button
3. Click the button
4. You should be redirected to Microsoft's login page
5. Sign in with your Microsoft account
6. Grant consent if prompted
7. You should be redirected back to your application, logged in
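Under the hood, the "Sign in with Microsoft" button calls supabase-js's `signInWithOAuth` with the `azure` provider. A minimal sketch follows, using a stand-in interface for the auth client so the snippet is self-contained; the option names match supabase-js v2, while the wrapper function itself is illustrative:

```typescript
// Stand-in for the relevant slice of the supabase-js v2 auth client.
interface OAuthClient {
  signInWithOAuth(opts: {
    provider: string;
    options?: { redirectTo?: string; scopes?: string };
  }): Promise<{ error: Error | null }>;
}

// Illustrative click handler: kicks off the Microsoft Entra OAuth flow.
async function signInWithMicrosoft(
  auth: OAuthClient,
  returnUrl: string
): Promise<void> {
  const { error } = await auth.signInWithOAuth({
    provider: "azure", // Supabase's provider name for Microsoft Entra
    options: {
      redirectTo: returnUrl, // where Supabase sends the user afterwards
      scopes: "openid profile email", // matches the permissions from Step 3
    },
  });
  if (error) throw error;
}
```

With a real client this would be called as `signInWithMicrosoft(supabase.auth, window.location.origin)` from the button's click handler.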
### 7.2 Verify User Data
After successful login, check that:
- User profile information is available
- Email address is correctly populated
- User roles/permissions are properly assigned (if using RBAC)
### 7.3 Common Issues and Troubleshooting
#### Redirect URI Mismatch
**Error**: `AADSTS50011: The redirect URI ... does not match the redirect URIs configured`
**Solution**: Ensure the redirect URI in Azure matches exactly:
```
https://<your-project-ref>.supabase.co/auth/v1/callback
```
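Azure compares redirect URIs essentially character-for-character (host case aside), so a trailing slash or a differently-cased path segment is enough to trigger AADSTS50011. A hypothetical checker illustrating the common pitfalls:

```typescript
// Illustrative comparison of a registered redirect URI against the one a
// request actually used. Scheme and path (including any trailing slash)
// must match exactly; the host is compared case-insensitively. Query
// strings are not considered here, for brevity.
function redirectUriMatches(registered: string, actual: string): boolean {
  const a = new URL(registered);
  const b = new URL(actual);
  return (
    a.protocol === b.protocol && // http vs https is a mismatch
    a.host === b.host && // URL parsing already lowercases the host
    a.pathname === b.pathname // path is case-sensitive; slash-sensitive too
  );
}
```

When debugging, compare the `redirect_uri` query parameter from the failing request against the value registered in Azure with exactly this kind of strictness.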
#### Invalid Client Secret
**Error**: `Invalid client secret provided`
**Solution**:
- Verify you copied the secret **Value**, not the Secret ID
- Generate a new client secret if the old one expired
#### Missing Permissions
**Error**: User consent required or permission denied
**Solution**:
- Add the required API permissions in Azure (Step 3)
- Grant admin consent if available
#### CORS Errors
**Error**: CORS policy blocking requests
**Solution**:
- Ensure your application URL is properly configured in Supabase
- Check that you're using the correct Supabase URL
## Step 8: Production Deployment
### 8.1 Update Redirect URI for Production
1. Go back to Azure Portal > App registrations
2. Navigate to **Authentication**
3. Add production redirect URI:
```
https://your-production-domain.com/
```
4. Ensure Supabase callback URI is still present
### 8.2 Set Production Environment Variables
Update your production environment with:
```bash
VITE_SUPABASE_URL=https://<your-project-ref>.supabase.co
VITE_SUPABASE_ANON_KEY=<your-production-anon-key>
VITE_ENABLE_MICROSOFT_LOGIN=true
```
### 8.3 Security Best Practices
1. **Rotate Secrets Regularly**: Set calendar reminders before client secret expiration
2. **Use Azure Key Vault**: Store secrets in Azure Key Vault for enhanced security
3. **Monitor Sign-ins**: Use Azure AD sign-in logs to monitor authentication activity
4. **Implement MFA**: Require multi-factor authentication for sensitive accounts
5. **Review Permissions**: Regularly audit API permissions and remove unnecessary ones
## Advanced Configuration
### Multi-Tenant Support
If you want to support users from multiple Azure AD tenants:
1. In Azure App Registration > **Authentication**:
- Set **Supported account types** to **Multitenant**
2. In Supabase Azure provider configuration:
- Set **Azure Tenant** to `common`
### Custom Domain Configuration
If using a custom domain with Supabase:
1. Configure custom domain in Supabase dashboard
2. Update redirect URI in Azure to use custom domain
3. Update application environment variables
### User Attribute Mapping
To map Azure AD attributes to Supabase user metadata:
1. Use Supabase database triggers or functions
2. Extract attributes from JWT token
3. Update user metadata in `auth.users` table
Example attributes available:
- `sub`: User's unique ID
- `email`: Email address
- `name`: Full name
- `given_name`: First name
- `family_name`: Last name
- `preferred_username`: Username
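These claims travel in the ID token's payload (the middle base64url segment of the JWT), so they can be read with a plain decode. A sketch for display purposes only; it performs no signature verification, so it must never be used for authorization decisions:

```typescript
// Decode the claims from a JWT payload. A JWT is three base64url
// segments: header.payload.signature. This only *reads* the payload;
// it does NOT verify the signature.
function decodeJwtClaims(token: string): Record<string, unknown> {
  const payload = token.split(".")[1];
  const json = Buffer.from(payload, "base64url").toString("utf8");
  return JSON.parse(json);
}
```

A database trigger or edge function could use the same idea server-side to copy `given_name` / `family_name` into user metadata, as described above.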
## Integration with RBAC System
To integrate Microsoft Entra authentication with your existing RBAC system:
### Option 1: Manual Role Assignment
After user signs in via Microsoft:
1. Admin assigns roles in the management dashboard
2. Roles stored in your `user_roles` table
3. User gets appropriate permissions on next login
### Option 2: Azure AD Group Mapping
Map Azure AD groups to application roles:
1. Configure group claims in Azure token configuration
2. Read groups from JWT token in Supabase
3. Automatically assign roles based on group membership
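The mapping step can be sketched as follows, assuming the token's `groups` claim carries Azure AD group object IDs; the mapping table and role names here are purely illustrative:

```typescript
// Hypothetical Azure AD group ID -> application role mapping.
const GROUP_ROLE_MAP: Record<string, string> = {
  "11111111-1111-1111-1111-111111111111": "admin",
  "22222222-2222-2222-2222-222222222222": "operator",
};

// Derive application roles from the `groups` claim; unknown groups are
// ignored and duplicates are removed.
function rolesFromGroups(groups: string[]): string[] {
  const roles = groups
    .map((g) => GROUP_ROLE_MAP[g])
    .filter((r): r is string => r !== undefined);
  return [...new Set(roles)];
}
```

Note that Azure only emits the `groups` claim if group claims are enabled in the app registration's token configuration (Step 4.1 territory), and large group memberships may be delivered as an overage link instead of an inline list.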
### Option 3: Hybrid Approach
Support both Microsoft and email/password login:
- Keep existing email/password authentication
- Add Microsoft as an additional option
- Users can link accounts or use either method
## Monitoring and Maintenance
### Azure AD Monitoring
- **Sign-in logs**: Monitor authentication attempts
- **Audit logs**: Track configuration changes
- **Usage analytics**: Understand authentication patterns
### Supabase Monitoring
- **Auth logs**: View authentication events
- **User activity**: Track active users
- **Error logs**: Identify authentication issues
### Regular Maintenance Tasks
- [ ] Rotate client secrets before expiration (set reminders)
- [ ] Review and update API permissions quarterly
- [ ] Audit user access and remove inactive users
- [ ] Update token configuration as needed
- [ ] Test authentication flow after any infrastructure changes
## Support and Resources
### Microsoft Documentation
- [Microsoft identity platform documentation](https://docs.microsoft.com/azure/active-directory/develop/)
- [Azure AD app registration](https://docs.microsoft.com/azure/active-directory/develop/quickstart-register-app)
- [OpenID Connect on Microsoft identity platform](https://docs.microsoft.com/azure/active-directory/develop/v2-protocols-oidc)
### Supabase Documentation
- [Supabase Auth with Azure](https://supabase.com/docs/guides/auth/social-login/auth-azure)
- [Supabase Auth API reference](https://supabase.com/docs/reference/javascript/auth-signinwithoauth)
### Troubleshooting Resources
- Azure AD Error Codes: https://docs.microsoft.com/azure/active-directory/develop/reference-aadsts-error-codes
- Supabase Discord Community: https://discord.supabase.com
---
**Last Updated**: January 2026
**Maintained By**: USDA Vision System Team

`docs/OAUTH_USER_SYNC_FIX.md`
# OAuth User Synchronization Fix
## Problem
When a user signs in with an OAuth provider (Microsoft Entra/Azure AD) for the first time, the user is added to `auth.users` in Supabase but NOT to the application's `user_profiles` table. This causes the application to fail when trying to load user data, as there's no user profile available.
## Solution
Implemented automatic user profile creation for OAuth users through a multi-layered approach:
### 1. Database Trigger (00003_oauth_user_sync.sql)
- **Location**: `supabase/migrations/00003_oauth_user_sync.sql` and `management-dashboard-web-app/supabase/migrations/00003_oauth_user_sync.sql`
- **Function**: `handle_new_oauth_user()`
- Automatically creates a user profile in `public.user_profiles` when a new user is created in `auth.users`
- Handles the synchronization at the database level, ensuring it works regardless of where users are created
- Includes race condition handling with `ON CONFLICT DO NOTHING`
- **Trigger**: `on_auth_user_created`
- Fires after INSERT on `auth.users`
- Ensures every new OAuth user gets a profile entry
### 2. Client-Side Utility Function (src/lib/supabase.ts)
- **Function**: `userManagement.syncOAuthUser()`
- Provides a fallback synchronization mechanism for any OAuth users that slip through
- Checks if user profile exists before creating
- Handles race conditions gracefully (duplicate key errors)
- Includes comprehensive error logging for debugging
**Logic**:
```typescript
1. Get current authenticated user from Supabase Auth
2. Check if user profile already exists in user_profiles table
3. If exists: return (no action needed)
4. If not exists: create user profile with:
- id: user's UUID from auth.users
- email: user's email from auth.users
- status: 'active' (default status)
5. Handle errors gracefully:
- "No rows returned" (PGRST116): Expected, user doesn't exist yet
- "Duplicate key" (23505): Race condition, another process created it first
- Other errors: Log and continue
```
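The pseudocode above can be sketched in TypeScript against a minimal store interface, which makes the race-condition handling explicit. The real implementation talks to Supabase, where `23505` is Postgres's duplicate-key error code; the interface and field names below are simplified stand-ins:

```typescript
interface Profile {
  id: string;
  email: string;
  status: string;
}

// Minimal stand-in for the user_profiles table access layer.
interface ProfileStore {
  find(id: string): Promise<Profile | null>; // null plays the role of PGRST116
  insert(p: Profile): Promise<void>; // throws { code: "23505" } on duplicate
}

async function syncOAuthUser(
  store: ProfileStore,
  user: { id: string; email: string }
): Promise<void> {
  // 1-3. Profile already exists: nothing to do.
  if (await store.find(user.id)) return;
  try {
    // 4. Create the profile with the default status.
    await store.insert({ id: user.id, email: user.email, status: "active" });
  } catch (err: any) {
    // 5. Duplicate key: another process (e.g. the DB trigger) won the race.
    if (err?.code === "23505") return;
    // Other errors: log and continue rather than blocking login.
    console.error("syncOAuthUser failed", err);
  }
}
```

Calling this function twice for the same user, or concurrently with the database trigger, is harmless by construction.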
### 3. App Integration (src/App.tsx)
- **Updated**: Auth state change listener
- **Triggers**: When `SIGNED_IN` or `INITIAL_SESSION` event occurs
- **Action**: Calls `userManagement.syncOAuthUser()` asynchronously
- **Benefit**: Ensures user profile exists before the rest of the app tries to access it
## How It Works
### OAuth Sign-In Flow (New)
```
1. User clicks "Sign in with Microsoft"
2. Redirected to Microsoft login
3. Microsoft authenticates and redirects back
4. Supabase creates entry in auth.users
5. Database trigger fires → user_profiles entry created
6. App receives SIGNED_IN event
7. App calls syncOAuthUser() as extra safety measure
8. User profile is guaranteed to exist
9. getUserProfile() and loadData() succeed
```
## Backward Compatibility
- **Non-invasive**: The solution uses triggers and utility functions, doesn't modify existing tables
- **Graceful degradation**: If either layer fails, the other provides a fallback
- **No breaking changes**: Existing APIs and components remain unchanged
## Testing Recommendations
### Test 1: First-Time OAuth Sign-In
1. Clear browser cookies/session
2. Click "Sign in with Microsoft"
3. Complete OAuth flow
4. Verify:
- User is in Supabase auth
- User is in `user_profiles` table
- App loads user data without errors
- Dashboard displays correctly
### Test 2: Verify Database Trigger
1. Directly create a user in `auth.users` via SQL
2. Verify that `user_profiles` entry is automatically created
3. Check timestamp to confirm trigger fired
### Test 3: Verify Client-Side Fallback
1. Manually delete a user's `user_profiles` entry
2. Reload the app
3. Verify that `syncOAuthUser()` recreates the profile
4. Check browser console for success logs
## Files Modified
1. **Database Migrations**:
- `/usda-vision/supabase/migrations/00003_oauth_user_sync.sql` (NEW)
- `/usda-vision/management-dashboard-web-app/supabase/migrations/00003_oauth_user_sync.sql` (NEW)
2. **TypeScript/React**:
- `/usda-vision/management-dashboard-web-app/src/lib/supabase.ts` (MODIFIED)
- Added `syncOAuthUser()` method to `userManagement` object
- `/usda-vision/management-dashboard-web-app/src/App.tsx` (MODIFIED)
- Import `userManagement`
- Call `syncOAuthUser()` on auth state change
## Deployment Steps
1. **Apply Database Migrations**:
```bash
# Run the new migration
supabase migration up
```
2. **Deploy Application Code**:
- Push the changes to `src/lib/supabase.ts` and `src/App.tsx`
- No environment variable changes needed
- No configuration changes needed
3. **Test in Staging**:
- Test OAuth sign-in with a fresh account
- Verify user profile is created
- Check app functionality
4. **Monitor in Production**:
- Watch browser console for any errors from `syncOAuthUser()`
- Check database logs to confirm trigger is firing
- Monitor user creation metrics
## Future Enhancements
- Assign default roles to new OAuth users (currently requires manual assignment)
- Pre-populate `first_name` and `last_name` from OAuth provider data
- Add user profile completion workflow for new OAuth users
- Auto-disable account creation for users outside the organization

# Self-Hosted Supabase - Microsoft Entra Setup
## Quick Setup Guide
For self-hosted Supabase instances, OAuth providers like Microsoft Entra (Azure AD) are configured through config files and environment variables, not through the UI.
### Step 1: Configure Azure Application
Follow steps 1-4 in [MICROSOFT_ENTRA_SETUP.md](MICROSOFT_ENTRA_SETUP.md) to:
1. Register your app in Azure Portal
2. Get your Client ID and Secret
3. Set up API permissions
4. Configure token claims
**Important**: Your redirect URI should be:
```
http://<your-host-ip>:<supabase-port>/auth/v1/callback
```
Example: `http://192.168.1.100:54321/auth/v1/callback`
### Step 2: Configure Supabase
The Azure provider configuration is already added to `supabase/config.toml`:
```toml
[auth.external.azure]
enabled = false # Change this to true
client_id = "env(AZURE_CLIENT_ID)"
secret = "env(AZURE_CLIENT_SECRET)"
redirect_uri = ""
url = "https://login.microsoftonline.com/env(AZURE_TENANT_ID)/v2.0"
skip_nonce_check = false
```
### Step 3: Set Environment Variables
1. Copy the example file:
```bash
cp .env.azure.example .env.azure
```
2. Edit `.env.azure` with your actual values:
```bash
AZURE_CLIENT_ID=your-application-client-id
AZURE_CLIENT_SECRET=your-client-secret
AZURE_TENANT_ID=common # or your specific tenant ID
```
3. Source the environment file before starting Supabase:
```bash
source .env.azure
```
Or add it to your docker-compose environment.
### Step 4: Enable Azure Provider
Edit `supabase/config.toml` and change:
```toml
[auth.external.azure]
enabled = true # Change from false to true
```
### Step 5: Restart Supabase
```bash
docker-compose down
docker-compose up -d
```
Or if using the project script:
```bash
./docker-compose.sh restart
```
### Step 6: Enable in Application
In `management-dashboard-web-app/.env`:
```bash
VITE_ENABLE_MICROSOFT_LOGIN=true
```
### Verification
1. Check auth service logs:
```bash
docker-compose logs auth | grep -i azure
```
2. You should see the Microsoft login button on your application's login page
3. Click it and verify you're redirected to Microsoft login
### Troubleshooting
#### Azure Provider Not Working
**Check logs**:
```bash
docker-compose logs auth
```
**Verify environment variables are loaded**:
```bash
docker-compose exec auth env | grep AZURE
```
#### Redirect URI Mismatch
Ensure the redirect URI in Azure exactly matches:
```
http://<your-host-ip>:<supabase-port>/auth/v1/callback
```
Common mistake: Using `localhost` instead of the actual IP address.
#### Environment Variables Not Set
If you see errors about missing AZURE variables, make sure to:
1. Export them in your shell before running docker-compose
2. Or add them to your docker-compose.yml environment section
3. Or use a .env file that docker-compose automatically loads
### Docker Compose Environment Variables
You can also add the variables directly to your `docker-compose.yml`:
```yaml
services:
auth:
environment:
AZURE_CLIENT_ID: ${AZURE_CLIENT_ID}
AZURE_CLIENT_SECRET: ${AZURE_CLIENT_SECRET}
AZURE_TENANT_ID: ${AZURE_TENANT_ID:-common}
```
Then create a `.env` file in the same directory:
```bash
AZURE_CLIENT_ID=your-client-id
AZURE_CLIENT_SECRET=your-secret
AZURE_TENANT_ID=common
```
### Security Notes
- Never commit `.env.azure` or `.env` files with real secrets to git
- Add them to `.gitignore`
- Use environment variable substitution in config.toml
- Rotate client secrets regularly (before expiration)
- Monitor sign-in logs in Azure Portal
### Additional Resources
- Full setup guide: [MICROSOFT_ENTRA_SETUP.md](MICROSOFT_ENTRA_SETUP.md)
- Quick reference: [MICROSOFT_ENTRA_QUICKSTART.md](MICROSOFT_ENTRA_QUICKSTART.md)
- Supabase self-hosting docs: https://supabase.com/docs/guides/self-hosting
- Azure OAuth docs: https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow

The Supabase containers are now integrated into the main `docker-compose.yml` file.
## What Changed
All Supabase services are now defined in the root `docker-compose.yml`:
- **supabase-db**: PostgreSQL database (port 54322)
- **supabase-rest**: PostgREST API (port 54321)
- **supabase-auth**: GoTrue authentication service (port 9999)
The default anon key for local development is:
```
[REDACTED]
```
### Migrations
Migrations are automatically run on first startup via the `supabase-migrate` service. The service:
1. Waits for the database to be ready
2. Runs all migrations from `supabase/migrations/` in alphabetical order
3. Runs seed files (`seed_01_users.sql` and `seed_02_phase2_experiments.sql`)
If you need to re-run migrations, you can:
1. Stop the containers: `docker compose down`
2. Remove the database volume: `docker volume rm usda-vision_supabase-db`
3. Start again: `docker compose up -d`
### Accessing Services
- **Supabase API**: <http://localhost:54321>
- **Supabase Studio**: <http://localhost:54323>
- **Email Testing (Inbucket)**: <http://localhost:54324>
- **Database (direct)**: localhost:54322
### Network
### Port Conflicts
If you get port conflicts, make sure:
- No other Supabase instances are running
- The Supabase CLI isn't running containers (`supabase stop` if needed)
### Migration Issues
If migrations fail:
1. Check the logs: `docker compose logs supabase-migrate`
2. Ensure migration files are valid SQL
3. You may need to manually connect to the database and fix issues
### Database Connection Issues
If services can't connect to the database:
1. Check database is healthy: `docker compose ps supabase-db`
2. Check logs: `docker compose logs supabase-db`
3. Ensure the database password matches across all services

### For Supabase CLI Users
**Before** (old way):
```bash
cd management-dashboard-web-app
supabase start
supabase db reset
```
**After** (new way):
```bash
# From project root - no need to cd!
supabase start
If you have scripts or documentation that reference the old path, update them:
- `management-dashboard-web-app/supabase/config.toml` → `supabase/config.toml`
## Current State
The old directory (`management-dashboard-web-app/supabase/`) has been removed. All Supabase and database configuration, migrations, and seeds now live only under the project root `supabase/` directory, which Docker Compose and the Supabase CLI use exclusively.
## Verification
To verify the migration:
1. **Check docker-compose paths**:
```bash
grep "supabase" docker-compose.yml
# Should show only ./supabase/ (no management-dashboard-web-app/supabase/)
```
2. **Test Supabase CLI**:
```bash
supabase start
# Should work without needing to cd into management-dashboard-web-app
```
3. **Test migrations**:
```bash
docker compose up -d
docker compose logs supabase-migrate
✅ Easier to share database across services
✅ Better alignment with monorepo best practices
✅ Infrastructure separated from application code

`docs/database_entities.md`
# Database Entities Documentation
This document describes the core entities in the USDA Vision database schema, focusing on entity-specific attributes (excluding generic fields like `id`, `created_at`, `updated_at`, `created_by`).
## Entity Relationships Overview
```
Experiment Phase (Template)
Experiment
Experiment Repetition
Experiment Phase Execution (Soaking, Airdrying, Cracking, Shelling)
```
---
## 1. Experiment Phase
**Table:** `experiment_phases`
**Purpose:** Defines a template/blueprint for experiments that specifies which processing phases are included and their configuration.
### Attributes
- **name** (TEXT, UNIQUE, NOT NULL)
- Unique name identifying the experiment phase template
- Example: "Phase 2 - Standard Processing"
- **description** (TEXT, nullable)
- Optional description providing details about the experiment phase
- **has_soaking** (BOOLEAN, NOT NULL, DEFAULT false)
- Indicates whether soaking phase is included in experiments using this template
- **has_airdrying** (BOOLEAN, NOT NULL, DEFAULT false)
- Indicates whether airdrying phase is included in experiments using this template
- **has_cracking** (BOOLEAN, NOT NULL, DEFAULT false)
- Indicates whether cracking phase is included in experiments using this template
- **has_shelling** (BOOLEAN, NOT NULL, DEFAULT false)
- Indicates whether shelling phase is included in experiments using this template
- **cracking_machine_type_id** (UUID, nullable)
- References the machine type to be used for cracking (required if `has_cracking` is true)
- Links to `machine_types` table
### Constraints
- At least one phase (soaking, airdrying, cracking, or shelling) must be enabled
- If `has_cracking` is true, `cracking_machine_type_id` must be provided
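For illustration, the same two constraints expressed as an application-side check; the real enforcement is in SQL on the `experiment_phases` table, and the field names mirror the columns above:

```typescript
interface ExperimentPhaseTemplate {
  has_soaking: boolean;
  has_airdrying: boolean;
  has_cracking: boolean;
  has_shelling: boolean;
  cracking_machine_type_id: string | null;
}

// Returns a list of constraint violations (empty list = valid template).
function validatePhaseTemplate(p: ExperimentPhaseTemplate): string[] {
  const errors: string[] = [];
  if (!p.has_soaking && !p.has_airdrying && !p.has_cracking && !p.has_shelling) {
    errors.push("at least one phase must be enabled");
  }
  if (p.has_cracking && p.cracking_machine_type_id === null) {
    errors.push("cracking requires a machine type");
  }
  return errors;
}
```

Running such a check in the form layer gives users immediate feedback instead of surfacing a database constraint error after submission.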
---
## 2. Experiment
**Table:** `experiments`
**Purpose:** Defines an experiment blueprint that specifies the parameters and requirements for conducting pecan processing experiments.
### Attributes
- **experiment_number** (INTEGER, NOT NULL)
- Unique number identifying the experiment
- Combined with `phase_id` must be unique
- **reps_required** (INTEGER, NOT NULL, CHECK > 0)
- Number of repetitions required for this experiment
- Must be greater than zero
- **weight_per_repetition_lbs** (DOUBLE PRECISION, NOT NULL, DEFAULT 5.0, CHECK > 0)
- Weight in pounds required for each repetition of the experiment
- **results_status** (TEXT, NOT NULL, DEFAULT 'valid', CHECK IN ('valid', 'invalid'))
- Status indicating whether the experiment results are considered valid or invalid
- **completion_status** (BOOLEAN, NOT NULL, DEFAULT false)
- Indicates whether the experiment has been completed
- **phase_id** (UUID, NOT NULL)
- References the experiment phase template this experiment belongs to
- Links to `experiment_phases` table
### Constraints
- Combination of `experiment_number` and `phase_id` must be unique
---
## 3. Experiment Repetition
**Table:** `experiment_repetitions`
**Purpose:** Represents a single execution instance of an experiment that can be scheduled and tracked.
### Attributes
- **experiment_id** (UUID, NOT NULL)
- References the parent experiment blueprint
- Links to `experiments` table
- **repetition_number** (INTEGER, NOT NULL, CHECK > 0)
- Sequential number identifying this repetition within the experiment
- Must be unique per experiment
- **scheduled_date** (TIMESTAMP WITH TIME ZONE, nullable)
- Date and time when the repetition is scheduled to be executed
- **status** (TEXT, NOT NULL, DEFAULT 'pending', CHECK IN ('pending', 'in_progress', 'completed', 'cancelled'))
- Current status of the repetition execution
- Values: `pending`, `in_progress`, `completed`, `cancelled`
### Constraints
- Combination of `experiment_id` and `repetition_number` must be unique
---
## 4. Experiment Phase Executions
**Table:** `experiment_phase_executions`
**Purpose:** Unified table storing execution data for all phase types (soaking, airdrying, cracking, shelling) associated with experiment repetitions.
### Common Attributes (All Phase Types)
- **repetition_id** (UUID, NOT NULL)
- References the experiment repetition this phase execution belongs to
- Links to `experiment_repetitions` table
- **phase_type** (TEXT, NOT NULL, CHECK IN ('soaking', 'airdrying', 'cracking', 'shelling'))
- Type of phase being executed
- Must be one of: `soaking`, `airdrying`, `cracking`, `shelling`
- **scheduled_start_time** (TIMESTAMP WITH TIME ZONE, NOT NULL)
- Planned start time for the phase execution
- **scheduled_end_time** (TIMESTAMP WITH TIME ZONE, nullable)
- Planned end time for the phase execution
- Automatically calculated for soaking and airdrying based on duration
- **actual_start_time** (TIMESTAMP WITH TIME ZONE, nullable)
- Actual time when the phase execution started
- **actual_end_time** (TIMESTAMP WITH TIME ZONE, nullable)
- Actual time when the phase execution ended
- **status** (TEXT, NOT NULL, DEFAULT 'pending', CHECK IN ('pending', 'scheduled', 'in_progress', 'completed', 'cancelled'))
- Current status of the phase execution
### Phase-Specific Concepts: Independent & Dependent Variables
> **Note:** This section describes the conceptual variables for each phase (what we measure or control), not necessarily the current physical columns in the database. Some of these variables will be added to the schema in future migrations.
#### Soaking Phase
- **Independent variables (IV)**
- **Pre-soaking shell moisture percentage**
- Moisture percentage of the shell **before soaking**.
- **Pre-soaking kernel moisture percentage**
- Moisture percentage of the kernel **before soaking**.
- **Average pecan diameter (inches)**
- Average diameter of pecans in the batch, measured in inches.
- **Batch weight**
- Total weight of the batch being soaked.
- **Soaking duration (minutes)**
- Duration of the soaking process in minutes (currently represented as `soaking_duration_minutes`).
- **Dependent variables (DV)**
- **Post-soaking shell moisture percentage**
- Moisture percentage of the shell **after soaking**.
- **Post-soaking kernel moisture percentage**
- Moisture percentage of the kernel **after soaking**.
#### Airdrying Phase
- **Independent variables (IV)**
- **Airdrying duration (minutes)**
- Duration of the airdrying process in minutes (currently represented as `duration_minutes`).
- **Dependent variables (DV)**
- *(TBD — moisture/other measurements after airdrying can be added here when finalized.)*
#### Cracking Phase
- **Independent variables (IV)**
- **Cracking machine type**
- The type of cracking machine used (linked via `machine_type_id` and `experiment_phases.cracking_machine_type_id`).
- **Dependent variables (DV)**
- *None defined yet for cracking.*
Business/analysis metrics for cracking can be added later (e.g., crack quality, breakage rates).
#### Shelling Phase
- **Independent variables (IV)**
- **Shelling machine configuration parameters** (not yet present in the DB schema)
- **Ring gap (inches)**
- Radial gap setting of the shelling ring (e.g., `0.34` inches).
- **Paddle RPM**
- Rotational speed of the paddles (integer RPM value).
- **Third machine parameter (TBD)**
- The shelling machine expects a third configuration parameter that will be defined and added to the schema later.
- **Dependent variables (DV)**
- **Half-yield ratio (percentage)**
- Percentage of the shelled product that is composed of half kernels, relative to total yield.
### Constraints
- Combination of `repetition_id` and `phase_type` must be unique (one execution per phase type per repetition)
### Notes
- Phase executions are automatically created when an experiment repetition is created, based on the experiment phase template configuration
- Sequential phases (soaking → airdrying → cracking → shelling) automatically calculate their `scheduled_start_time` based on the previous phase's `scheduled_end_time`
- For soaking and airdrying phases, `scheduled_end_time` is automatically calculated from `scheduled_start_time` + duration
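The scheduling rules in these notes can be sketched as a simple pass over the phase list; the field names here are illustrative, not the actual columns:

```typescript
interface PhasePlan {
  type: string;
  durationMinutes?: number; // set for soaking/airdrying, absent otherwise
}

interface ScheduledPhase {
  type: string;
  start: Date;
  end: Date | null;
}

// Each phase starts when the previous one ends; timed phases get
// end = start + duration, matching the behaviour described above.
function schedulePhases(firstStart: Date, phases: PhasePlan[]): ScheduledPhase[] {
  const out: ScheduledPhase[] = [];
  let cursor = firstStart;
  for (const p of phases) {
    const end =
      p.durationMinutes != null
        ? new Date(cursor.getTime() + p.durationMinutes * 60_000)
        : null; // cracking/shelling have no fixed duration
    out.push({ type: p.type, start: cursor, end });
    cursor = end ?? cursor; // next phase chains off this end, when known
  }
  return out;
}
```

For example, a 60-minute soak starting at 08:00 followed by a 30-minute airdry puts the cracking phase's scheduled start at 09:30.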
---
## 5. Machine Types
**Table:** `machine_types`
**Purpose:** Defines the types of machines available for cracking operations.
### Attributes
- **name** (TEXT, UNIQUE, NOT NULL)
- Unique name identifying the machine type
- Example: "JC Cracker", "Meyer Cracker"
- **description** (TEXT, nullable)
- Optional description of the machine type
### Related Tables
Machine types are referenced by:
- `experiment_phases.cracking_machine_type_id` - Defines which machine type to use for cracking in an experiment phase template
- `experiment_phase_executions.machine_type_id` - Specifies which machine was used for a specific cracking execution
---
## 6. Cracker Parameters (Machine-Specific)
### JC Cracker Parameters
**Table:** `jc_cracker_parameters`
**Purpose:** Stores parameters specific to JC Cracker machines.
#### Attributes
- **plate_contact_frequency_hz** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
- Frequency of plate contact in Hertz
- **throughput_rate_pecans_sec** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
- Rate of pecan processing in pecans per second
- **crush_amount_in** (DOUBLE PRECISION, NOT NULL, CHECK >= 0)
- Amount of crushing in inches
- **entry_exit_height_diff_in** (DOUBLE PRECISION, NOT NULL)
- Difference in height between entry and exit points in inches
### Meyer Cracker Parameters
**Table:** `meyer_cracker_parameters`
**Purpose:** Stores parameters specific to Meyer Cracker machines.
#### Attributes
- **motor_speed_hz** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
- Motor speed in Hertz
- **jig_displacement_inches** (DOUBLE PRECISION, NOT NULL)
- Jig displacement in inches
- **spring_stiffness_nm** (DOUBLE PRECISION, NOT NULL, CHECK > 0)
- Spring stiffness in newtons per metre (N/m)
---
## Summary of Entity Relationships
1. **Experiment Phase** → Defines which phases are included and which machine type is used for cracking
2. **Experiment** → Belongs to an Experiment Phase, defines repetition requirements and weight per repetition
3. **Experiment Repetition** → Instance of an Experiment, can be scheduled and tracked
4. **Experiment Phase Execution** → Execution record for each phase (soaking, airdrying, cracking, shelling) within a repetition
5. **Machine Types** → Defines available cracking machines
6. **Cracker Parameters** → Machine-specific operational parameters (JC or Meyer)
### Key Relationships
- One Experiment Phase can have many Experiments
- One Experiment can have many Experiment Repetitions
- One Experiment Repetition can have multiple Phase Executions (one per phase type)
- Phase Executions are automatically created based on the Experiment Phase template configuration
- Cracking Phase Executions reference a Machine Type
- Machine Types can have associated Cracker Parameters (JC or Meyer specific)
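The one-execution-per-phase-type rule above can be illustrated with an in-memory sketch that enforces the same composite key the database's unique constraint on (`repetition_id`, `phase_type`) does (type and field names here are assumptions, not the actual schema types):

```typescript
// Hypothetical in-memory mirror of the unique constraint on
// (repetition_id, phase_type): a second execution for the same
// repetition and phase type is rejected, just as the database would.
interface PhaseExecutionRow {
  repetitionId: string;
  phaseType: string;
}

function addExecution(
  store: Map<string, PhaseExecutionRow>,
  exec: PhaseExecutionRow
): boolean {
  const key = `${exec.repetitionId}:${exec.phaseType}`;
  if (store.has(key)) return false; // unique-constraint violation
  store.set(key, exec);
  return true;
}
```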

flake.lock (generated, new file)

@@ -0,0 +1,61 @@
{
"nodes": {
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1731533236,
"narHash": "sha256-l0KFg5HjrsfsO/JpG+r7fRrqm12kzFHyUHqHCVpMMbI=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "11707dc2f618dd54ca8739b309ec4fc024de578b",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1769461804,
"narHash": "sha256-msG8SU5WsBUfVVa/9RPLaymvi5bI8edTavbIq3vRlhI=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "bfc1b8a4574108ceef22f02bafcf6611380c100d",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}

flake.nix (new file)

@@ -0,0 +1,105 @@
{
description = "USDA Vision camera management system";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }:
{
# NixOS module (system-independent)
nixosModules.default = import ./module.nix;
}
//
flake-utils.lib.eachDefaultSystem (system:
let
pkgs = import nixpkgs {
inherit system;
config.allowUnfree = true;
};
# Import our package definition
usda-vision-package = pkgs.callPackage ./package.nix { };
camera-sdk = pkgs.callPackage ./camera-sdk.nix { };
in
{
packages = {
default = usda-vision-package;
usda-vision = usda-vision-package;
camera-sdk = camera-sdk;
};
devShells.default = pkgs.mkShell {
name = "usda-vision-dev";
# Input packages for the development shell
buildInputs = with pkgs; [
# Core development tools
git
vim
curl
wget
# Docker for local development
docker
docker-compose
# Supabase CLI
supabase-cli
# Node.js for web app development
nodejs_20
nodePackages.npm
nodePackages.pnpm
# Python for camera API
python311
python311Packages.pip
python311Packages.virtualenv
# Camera SDK
camera-sdk
# Utilities
jq
yq
rsync
gnused
gawk
];
# Environment variables for development
shellHook = ''
export LD_LIBRARY_PATH="${camera-sdk}/lib:$LD_LIBRARY_PATH"
export CAMERA_SDK_PATH="${camera-sdk}"
# Set up Python virtual environment
if [ ! -d .venv ]; then
echo "Creating Python virtual environment..."
python -m venv .venv
fi
echo "USDA Vision Development Environment"
echo "===================================="
echo "Camera SDK: ${camera-sdk}"
echo ""
echo "Available commands:"
echo " - docker-compose: Manage containers"
echo " - supabase: Supabase CLI"
echo ""
echo "To activate Python venv: source .venv/bin/activate"
echo "To edit secrets: ragenix -e secrets/env.age"
echo ""
echo "NOTE: Secrets should be managed by ragenix in athenix for production deployments"
echo ""
'';
# Additional environment configuration
DOCKER_BUILDKIT = "1";
COMPOSE_DOCKER_CLI_BUILD = "1";
};
}
);
}


@@ -1,4 +1,4 @@
# Local Supabase config for Vite dev server
VITE_SUPABASE_URL=http://127.0.0.1:54321
VITE_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0
VITE_SUPABASE_ANON_KEY=[REDACTED]


@@ -1,6 +1,6 @@
# Local Supabase config for Vite dev server
VITE_SUPABASE_URL=http://exp-dash:54321
VITE_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0
VITE_SUPABASE_ANON_KEY=[REDACTED]
# Vision API Configuration


@@ -10,3 +10,10 @@ VITE_VISION_SYSTEM_REMOTE_URL=http://exp-dash:3002/assets/remoteEntry.js?v=$(dat
# API URLs
VITE_VISION_API_URL=http://exp-dash:8000
VITE_MEDIA_API_URL=http://exp-dash:8090
# Supabase Configuration
VITE_SUPABASE_URL=https://your-project-url.supabase.co
VITE_SUPABASE_ANON_KEY=your-anon-key
# Microsoft Entra (Azure AD) OAuth Configuration
VITE_ENABLE_MICROSOFT_LOGIN=true


@@ -1,5 +1,5 @@
import { useState, useEffect } from 'react'
import { supabase } from './lib/supabase'
import { supabase, userManagement } from './lib/supabase'
import { Login } from './components/Login'
import { Dashboard } from './components/Dashboard'
import { CameraRoute } from './components/CameraRoute'
@@ -14,11 +14,18 @@ function App() {
checkAuthState()
// Listen for auth changes
const { data: { subscription } } = supabase.auth.onAuthStateChange((event, session) => {
const { data: { subscription } } = supabase.auth.onAuthStateChange((event: string, session: any) => {
console.log('Auth state changed:', event, !!session)
setIsAuthenticated(!!session)
setLoading(false)
// Sync OAuth user on successful sign in (creates user profile if needed)
if ((event === 'SIGNED_IN' || event === 'INITIAL_SESSION') && session) {
userManagement.syncOAuthUser().catch((err) => {
console.error('Failed to sync OAuth user:', err)
})
}
// Handle signout route
if (event === 'SIGNED_OUT') {
setCurrentRoute('/')


@@ -26,9 +26,8 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
const [loading, setLoading] = useState(true)
const [error, setError] = useState<string | null>(null)
const [currentView, setCurrentView] = useState('dashboard')
const [isExpanded, setIsExpanded] = useState(true)
const [isExpanded, setIsExpanded] = useState(false)
const [isMobileOpen, setIsMobileOpen] = useState(false)
const [isHovered, setIsHovered] = useState(false)
// Valid dashboard views
const validViews = ['dashboard', 'user-management', 'experiments', 'analytics', 'data-entry', 'vision-system', 'scheduling', 'video-library', 'profile']
@@ -53,6 +52,26 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
}
}
// Save sidebar expanded state to localStorage
const saveSidebarState = (expanded: boolean) => {
try {
localStorage.setItem('sidebar-expanded', String(expanded))
} catch (error) {
console.warn('Failed to save sidebar state to localStorage:', error)
}
}
// Get saved sidebar state from localStorage
const getSavedSidebarState = (): boolean => {
try {
const saved = localStorage.getItem('sidebar-expanded')
return saved === 'true'
} catch (error) {
console.warn('Failed to get saved sidebar state from localStorage:', error)
return false
}
}
// Check if user has access to a specific view
const hasAccessToView = (view: string): boolean => {
if (!user) return false
@@ -80,6 +99,9 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
useEffect(() => {
fetchUserProfile()
// Load saved sidebar state
const savedSidebarState = getSavedSidebarState()
setIsExpanded(savedSidebarState)
}, [])
// Restore saved view when user is loaded
@@ -144,7 +166,9 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
}
const toggleSidebar = () => {
setIsExpanded(!isExpanded)
const newState = !isExpanded
setIsExpanded(newState)
saveSidebarState(newState)
}
const toggleMobileSidebar = () => {
@@ -225,7 +249,7 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
)
case 'scheduling':
return (
<ErrorBoundary>
<ErrorBoundary autoRetry={true} retryDelay={2000} maxRetries={3}>
<Suspense fallback={<div className="p-6">Loading scheduling module...</div>}>
<RemoteScheduling user={user} currentRoute={currentRoute} />
</Suspense>
@@ -300,8 +324,6 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
onViewChange={handleViewChange}
isExpanded={isExpanded}
isMobileOpen={isMobileOpen}
isHovered={isHovered}
setIsHovered={setIsHovered}
/>
{/* Backdrop for mobile */}
{isMobileOpen && (
@@ -312,7 +334,7 @@ export function DashboardLayout({ onLogout, currentRoute }: DashboardLayoutProps
)}
</div>
<div
className={`flex-1 transition-all duration-300 ease-in-out bg-gray-50 dark:bg-gray-900 flex flex-col min-h-0 ${isExpanded || isHovered ? "lg:ml-[290px]" : "lg:ml-[90px]"
className={`flex-1 transition-all duration-300 ease-in-out bg-gray-50 dark:bg-gray-900 flex flex-col min-h-0 ${isExpanded ? "lg:ml-[290px]" : "lg:ml-[90px]"
} ${isMobileOpen ? "ml-0" : ""}`}
>
<TopNavbar


@@ -5,20 +5,61 @@ type Props = {
fallback?: ReactNode
onRetry?: () => void
showRetry?: boolean
autoRetry?: boolean
retryDelay?: number
maxRetries?: number
}
type State = { hasError: boolean }
type State = { hasError: boolean; retryCount: number }
export class ErrorBoundary extends Component<Props, State> {
state: State = { hasError: false }
private retryTimeoutId?: NodeJS.Timeout
state: State = { hasError: false, retryCount: 0 }
static getDerivedStateFromError() {
return { hasError: true }
}
componentDidCatch() {}
componentDidCatch(error: Error, errorInfo: React.ErrorInfo) {
// Auto-retry logic for module federation loading issues
const maxRetries = this.props.maxRetries || 3
if (this.props.autoRetry !== false && this.state.retryCount < maxRetries) {
const delay = this.props.retryDelay || 2000
this.retryTimeoutId = setTimeout(() => {
this.setState(prevState => ({
hasError: false,
retryCount: prevState.retryCount + 1
}))
if (this.props.onRetry) {
this.props.onRetry()
}
}, delay)
}
}
componentDidUpdate(prevProps: Props, prevState: State) {
// Reset retry count if error is cleared and component successfully rendered
if (prevState.hasError && !this.state.hasError && this.state.retryCount > 0) {
// Give it a moment to see if it stays error-free
setTimeout(() => {
if (!this.state.hasError) {
this.setState({ retryCount: 0 })
}
}, 1000)
}
}
componentWillUnmount() {
if (this.retryTimeoutId) {
clearTimeout(this.retryTimeoutId)
}
}
handleRetry = () => {
this.setState({ hasError: false })
if (this.retryTimeoutId) {
clearTimeout(this.retryTimeoutId)
}
this.setState({ hasError: false, retryCount: 0 })
if (this.props.onRetry) {
this.props.onRetry()
}
@@ -43,6 +84,11 @@ export class ErrorBoundary extends Component<Props, State> {
<h3 className="text-sm font-medium text-red-800">Something went wrong loading this section</h3>
<div className="mt-2 text-sm text-red-700">
<p>An error occurred while loading this component. Please try reloading it.</p>
{this.props.autoRetry !== false && this.state.retryCount < (this.props.maxRetries || 3) && (
<p className="mt-1 text-xs text-red-600">
Retrying automatically... (Attempt {this.state.retryCount + 1} of {(this.props.maxRetries || 3) + 1})
</p>
)}
</div>
{(this.props.showRetry !== false) && (
<div className="mt-4">


@@ -3,7 +3,7 @@ import type { CreateExperimentRequest, UpdateExperimentRequest, ScheduleStatus,
import { experimentPhaseManagement, machineTypeManagement } from '../lib/supabase'
interface ExperimentFormProps {
initialData?: Partial<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }>
initialData?: Partial<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }> & { phase_id?: string | null }
onSubmit: (data: CreateExperimentRequest | UpdateExperimentRequest) => Promise<void>
onCancel: () => void
isEditing?: boolean
@@ -12,31 +12,41 @@ interface ExperimentFormProps {
}
export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = false, loading = false, phaseId }: ExperimentFormProps) {
const [formData, setFormData] = useState<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }>({
experiment_number: initialData?.experiment_number || 0,
reps_required: initialData?.reps_required || 1,
weight_per_repetition_lbs: (initialData as any)?.weight_per_repetition_lbs || 1,
soaking_duration_hr: initialData?.soaking_duration_hr || 0,
air_drying_time_min: initialData?.air_drying_time_min || 0,
plate_contact_frequency_hz: initialData?.plate_contact_frequency_hz || 1,
throughput_rate_pecans_sec: initialData?.throughput_rate_pecans_sec || 1,
crush_amount_in: initialData?.crush_amount_in || 0,
entry_exit_height_diff_in: initialData?.entry_exit_height_diff_in || 0,
// Meyer-specific (UI only)
motor_speed_hz: (initialData as any)?.motor_speed_hz || 1,
jig_displacement_inches: (initialData as any)?.jig_displacement_inches || 0,
spring_stiffness_nm: (initialData as any)?.spring_stiffness_nm || 1,
schedule_status: initialData?.schedule_status || 'pending schedule',
results_status: initialData?.results_status || 'valid',
completion_status: initialData?.completion_status || false,
phase_id: initialData?.phase_id || phaseId
const getInitialFormState = (d: any) => ({
experiment_number: d?.experiment_number ?? 0,
reps_required: d?.reps_required ?? 1,
weight_per_repetition_lbs: d?.weight_per_repetition_lbs ?? 1,
soaking_duration_hr: d?.soaking?.soaking_duration_hr ?? d?.soaking_duration_hr ?? 0,
air_drying_time_min: d?.airdrying?.duration_minutes ?? d?.air_drying_time_min ?? 0,
plate_contact_frequency_hz: d?.cracking?.plate_contact_frequency_hz ?? d?.plate_contact_frequency_hz ?? 1,
throughput_rate_pecans_sec: d?.cracking?.throughput_rate_pecans_sec ?? d?.throughput_rate_pecans_sec ?? 1,
crush_amount_in: d?.cracking?.crush_amount_in ?? d?.crush_amount_in ?? 0,
entry_exit_height_diff_in: d?.cracking?.entry_exit_height_diff_in ?? d?.entry_exit_height_diff_in ?? 0,
motor_speed_hz: d?.cracking?.motor_speed_hz ?? d?.motor_speed_hz ?? 1,
jig_displacement_inches: d?.cracking?.jig_displacement_inches ?? d?.jig_displacement_inches ?? 0,
spring_stiffness_nm: d?.cracking?.spring_stiffness_nm ?? d?.spring_stiffness_nm ?? 1,
schedule_status: d?.schedule_status ?? 'pending schedule',
results_status: d?.results_status ?? 'valid',
completion_status: d?.completion_status ?? false,
phase_id: d?.phase_id ?? phaseId,
ring_gap_inches: d?.shelling?.ring_gap_inches ?? d?.ring_gap_inches ?? null,
drum_rpm: d?.shelling?.drum_rpm ?? d?.drum_rpm ?? null
})
const [formData, setFormData] = useState<CreateExperimentRequest & { schedule_status: ScheduleStatus; results_status: ResultsStatus; completion_status: boolean }>(() => getInitialFormState(initialData))
const [errors, setErrors] = useState<Record<string, string>>({})
const [phase, setPhase] = useState<ExperimentPhase | null>(null)
const [crackingMachine, setCrackingMachine] = useState<MachineType | null>(null)
const [metaLoading, setMetaLoading] = useState<boolean>(false)
// When initialData loads with phase config (edit mode), sync form state
useEffect(() => {
if ((initialData as any)?.id) {
setFormData(prev => ({ ...prev, ...getInitialFormState(initialData) }))
}
}, [initialData])
useEffect(() => {
const loadMeta = async () => {
if (!phaseId) return
@@ -76,11 +86,11 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
}
if (formData.soaking_duration_hr < 0) {
if ((formData.soaking_duration_hr ?? 0) < 0) {
newErrors.soaking_duration_hr = 'Soaking duration cannot be negative'
}
if (formData.air_drying_time_min < 0) {
if ((formData.air_drying_time_min ?? 0) < 0) {
newErrors.air_drying_time_min = 'Air drying time cannot be negative'
}
@@ -93,7 +103,7 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
if (!formData.throughput_rate_pecans_sec || formData.throughput_rate_pecans_sec <= 0) {
newErrors.throughput_rate_pecans_sec = 'Throughput rate must be positive'
}
if (formData.crush_amount_in < 0) {
if ((formData.crush_amount_in ?? 0) < 0) {
newErrors.crush_amount_in = 'Crush amount cannot be negative'
}
}
@@ -110,6 +120,16 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
}
}
// Shelling: if provided, must be positive
if (phase?.has_shelling) {
if (formData.ring_gap_inches != null && (typeof formData.ring_gap_inches !== 'number' || formData.ring_gap_inches <= 0)) {
newErrors.ring_gap_inches = 'Ring gap must be positive'
}
if (formData.drum_rpm != null && (typeof formData.drum_rpm !== 'number' || formData.drum_rpm <= 0)) {
newErrors.drum_rpm = 'Drum RPM must be positive'
}
}
setErrors(newErrors)
return Object.keys(newErrors).length === 0
}
@@ -122,14 +142,25 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
}
try {
// Prepare data for submission
// Prepare data: include all phase params so they are stored in experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling
const submitData = isEditing ? formData : {
experiment_number: formData.experiment_number,
reps_required: formData.reps_required,
weight_per_repetition_lbs: formData.weight_per_repetition_lbs,
results_status: formData.results_status,
completion_status: formData.completion_status,
phase_id: formData.phase_id
phase_id: formData.phase_id,
soaking_duration_hr: formData.soaking_duration_hr,
air_drying_time_min: formData.air_drying_time_min,
plate_contact_frequency_hz: formData.plate_contact_frequency_hz,
throughput_rate_pecans_sec: formData.throughput_rate_pecans_sec,
crush_amount_in: formData.crush_amount_in,
entry_exit_height_diff_in: formData.entry_exit_height_diff_in,
motor_speed_hz: (formData as any).motor_speed_hz,
jig_displacement_inches: (formData as any).jig_displacement_inches,
spring_stiffness_nm: (formData as any).spring_stiffness_nm,
ring_gap_inches: formData.ring_gap_inches ?? undefined,
drum_rpm: formData.drum_rpm ?? undefined
}
await onSubmit(submitData)
@@ -138,7 +169,7 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
}
}
const handleInputChange = (field: keyof typeof formData, value: string | number | boolean) => {
const handleInputChange = (field: keyof typeof formData, value: string | number | boolean | null | undefined) => {
setFormData(prev => ({
...prev,
[field]: value
@@ -441,18 +472,40 @@ export function ExperimentForm({ initialData, onSubmit, onCancel, isEditing = fa
<h3 className="text-lg font-medium text-gray-900 mb-4">Shelling</h3>
<div className="grid grid-cols-1 md:grid-cols-2 gap-6">
<div>
<label className="block text-sm font-medium text-gray-700 mb-2">
Shelling Start Offset (minutes)
<label htmlFor="ring_gap_inches" className="block text-sm font-medium text-gray-700 mb-2">
Ring gap (inches)
</label>
<input
type="number"
value={(formData as any).shelling_start_offset_min || 0}
onChange={(e) => handleInputChange('shelling_start_offset_min' as any, parseInt(e.target.value) || 0)}
className="max-w-xs px-3 py-2 border border-gray-300 rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition-colors text-sm"
placeholder="0"
id="ring_gap_inches"
value={formData.ring_gap_inches ?? ''}
onChange={(e) => handleInputChange('ring_gap_inches' as any, e.target.value === '' ? null : parseFloat(e.target.value))}
className={`max-w-xs px-3 py-2 border rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition-colors text-sm ${errors.ring_gap_inches ? 'border-red-300' : 'border-gray-300'}`}
placeholder="e.g. 0.25"
min="0"
step="0.01"
/>
{errors.ring_gap_inches && (
<p className="mt-1 text-sm text-red-600">{errors.ring_gap_inches}</p>
)}
</div>
<div>
<label htmlFor="drum_rpm" className="block text-sm font-medium text-gray-700 mb-2">
Drum RPM
</label>
<input
type="number"
id="drum_rpm"
value={formData.drum_rpm ?? ''}
onChange={(e) => handleInputChange('drum_rpm' as any, e.target.value === '' ? null : parseInt(e.target.value, 10))}
className={`max-w-xs px-3 py-2 border rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition-colors text-sm ${errors.drum_rpm ? 'border-red-300' : 'border-gray-300'}`}
placeholder="e.g. 300"
min="1"
step="1"
/>
{errors.drum_rpm && (
<p className="mt-1 text-sm text-red-600">{errors.drum_rpm}</p>
)}
</div>
</div>
</div>


@@ -1,4 +1,4 @@
import { useState } from 'react'
import { useState, useEffect } from 'react'
import { ExperimentForm } from './ExperimentForm'
import { experimentManagement } from '../lib/supabase'
import type { Experiment, CreateExperimentRequest, UpdateExperimentRequest } from '../lib/supabase'
@@ -13,9 +13,20 @@ interface ExperimentModalProps {
export function ExperimentModal({ experiment, onClose, onExperimentSaved, phaseId }: ExperimentModalProps) {
const [loading, setLoading] = useState(false)
const [error, setError] = useState<string | null>(null)
const [initialData, setInitialData] = useState<Experiment | (Experiment & { soaking?: any; airdrying?: any; cracking?: any; shelling?: any }) | undefined>(experiment ?? undefined)
const isEditing = !!experiment
useEffect(() => {
if (experiment) {
experimentManagement.getExperimentWithPhaseConfig(experiment.id)
.then((data) => setInitialData(data ?? experiment))
.catch(() => setInitialData(experiment))
} else {
setInitialData(undefined)
}
}, [experiment?.id])
const handleSubmit = async (data: CreateExperimentRequest | UpdateExperimentRequest) => {
setError(null)
setLoading(true)
@@ -24,22 +35,24 @@ export function ExperimentModal({ experiment, onClose, onExperimentSaved, phaseI
let savedExperiment: Experiment
if (isEditing && experiment) {
// Check if experiment number is unique (excluding current experiment)
// Check if experiment number is unique within this phase (excluding current experiment)
if ('experiment_number' in data && data.experiment_number !== undefined && data.experiment_number !== experiment.experiment_number) {
const isUnique = await experimentManagement.isExperimentNumberUnique(data.experiment_number, experiment.id)
const phaseIdToCheck = data.phase_id ?? experiment.phase_id ?? phaseId
const isUnique = await experimentManagement.isExperimentNumberUnique(data.experiment_number, phaseIdToCheck ?? undefined, experiment.id)
if (!isUnique) {
setError('Experiment number already exists. Please choose a different number.')
setError('Experiment number already exists in this phase. Please choose a different number.')
return
}
}
savedExperiment = await experimentManagement.updateExperiment(experiment.id, data)
} else {
// Check if experiment number is unique for new experiments
// Check if experiment number is unique within this phase for new experiments
const createData = data as CreateExperimentRequest
const isUnique = await experimentManagement.isExperimentNumberUnique(createData.experiment_number)
const phaseIdToCheck = createData.phase_id ?? phaseId
const isUnique = await experimentManagement.isExperimentNumberUnique(createData.experiment_number, phaseIdToCheck ?? undefined)
if (!isUnique) {
setError('Experiment number already exists. Please choose a different number.')
setError('Experiment number already exists in this phase. Please choose a different number.')
return
}
@@ -115,7 +128,7 @@ export function ExperimentModal({ experiment, onClose, onExperimentSaved, phaseI
{/* Form */}
<ExperimentForm
initialData={experiment}
initialData={initialData ? { ...initialData, phase_id: initialData.phase_id ?? undefined } : undefined}
onSubmit={handleSubmit}
onCancel={handleCancel}
isEditing={isEditing}


@@ -31,8 +31,8 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
setPhases(phasesData)
setCurrentUser(userData)
} catch (err: any) {
setError(err.message || 'Failed to load experiment phases')
console.error('Load experiment phases error:', err)
setError(err.message || 'Failed to load experiment books')
console.error('Load experiment books error:', err)
} finally {
setLoading(false)
}
@@ -61,16 +61,16 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
<div className="mb-8">
<div className="flex justify-between items-center">
<div>
<h1 className="text-3xl font-bold text-gray-900 dark:text-white">Experiment Phases</h1>
<p className="mt-2 text-gray-600 dark:text-gray-400">Select an experiment phase to view and manage its experiments</p>
<p className="mt-2 text-gray-600 dark:text-gray-400">Experiment phases help organize experiments into logical groups for easier navigation and management.</p>
<h1 className="text-3xl font-bold text-gray-900 dark:text-white">Experiment Books</h1>
<p className="mt-2 text-gray-600 dark:text-gray-400">Select an experiment book to view and manage its experiments</p>
<p className="mt-2 text-gray-600 dark:text-gray-400">Experiment books help organize experiments into logical groups for easier navigation and management.</p>
</div>
{canManagePhases && (
<button
onClick={() => setShowCreateModal(true)}
className="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md text-white bg-blue-600 hover:bg-blue-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-blue-500"
>
New Phase
New Book
</button>
)}
</div>
@@ -162,9 +162,9 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
<svg className="mx-auto h-12 w-12 text-gray-400 dark:text-gray-500" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19.428 15.428a2 2 0 00-1.022-.547l-2.387-.477a6 6 0 00-3.86.517l-.318.158a6 6 0 01-3.86.517L6.05 15.21a2 2 0 00-1.806.547M8 4h8l-1 1v5.172a2 2 0 00.586 1.414l5 5c1.26 1.26.367 3.414-1.415 3.414H4.828c-1.782 0-2.674-2.154-1.414-3.414l5-5A2 2 0 009 10.172V5L8 4z" />
</svg>
<h3 className="mt-2 text-sm font-medium text-gray-900 dark:text-white">No experiment phases found</h3>
<h3 className="mt-2 text-sm font-medium text-gray-900 dark:text-white">No experiment books found</h3>
<p className="mt-1 text-sm text-gray-500 dark:text-gray-400">
Get started by creating your first experiment phase.
Get started by creating your first experiment book.
</p>
{canManagePhases && (
<div className="mt-6">
@@ -172,7 +172,7 @@ export function ExperimentPhases({ onPhaseSelect }: ExperimentPhasesProps) {
onClick={() => setShowCreateModal(true)}
className="inline-flex items-center px-4 py-2 border border-transparent shadow-sm text-sm font-medium rounded-md text-white bg-blue-600 hover:bg-blue-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-blue-500"
>
Create First Phase
Create First Book
</button>
</div>
)}


@@ -1,5 +1,6 @@
import { useState } from 'react'
import { supabase } from '../lib/supabase'
import { MicrosoftIcon } from './MicrosoftIcon'
interface LoginProps {
onLoginSuccess: () => void
@@ -10,6 +11,7 @@ export function Login({ onLoginSuccess }: LoginProps) {
const [password, setPassword] = useState('')
const [loading, setLoading] = useState(false)
const [error, setError] = useState<string | null>(null)
const enableMicrosoftLogin = import.meta.env.VITE_ENABLE_MICROSOFT_LOGIN === 'true'
const handleLogin = async (e: React.FormEvent) => {
e.preventDefault()
@@ -35,6 +37,32 @@ export function Login({ onLoginSuccess }: LoginProps) {
}
}
const handleMicrosoftLogin = async () => {
setLoading(true)
setError(null)
try {
const { error } = await supabase.auth.signInWithOAuth({
provider: 'azure',
options: {
scopes: 'email openid profile',
redirectTo: `${window.location.origin}/`,
},
})
if (error) {
setError(error.message)
setLoading(false)
}
// If successful, user will be redirected to Microsoft login
// and then back to the app, so we don't stop loading here
} catch (err) {
setError('An unexpected error occurred during Microsoft login')
console.error('Microsoft login error:', err)
setLoading(false)
}
}
return (
<div className="min-h-screen flex items-center justify-center bg-gray-50 dark:bg-gray-900">
<div className="max-w-md w-full space-y-8">
@@ -108,6 +136,33 @@ export function Login({ onLoginSuccess }: LoginProps) {
</button>
</div>
</form>
{enableMicrosoftLogin && (
<>
<div className="relative">
<div className="absolute inset-0 flex items-center">
<div className="w-full border-t border-gray-300 dark:border-gray-700" />
</div>
<div className="relative flex justify-center text-sm">
<span className="px-2 bg-gray-50 dark:bg-gray-900 text-gray-500 dark:text-gray-400">
Or continue with
</span>
</div>
</div>
<div>
<button
type="button"
onClick={handleMicrosoftLogin}
disabled={loading}
className="w-full flex items-center justify-center gap-3 py-2 px-4 border border-gray-300 dark:border-gray-700 rounded-md shadow-sm text-sm font-medium text-gray-700 dark:text-gray-300 bg-white dark:bg-gray-800 hover:bg-gray-50 dark:hover:bg-gray-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:opacity-50 disabled:cursor-not-allowed"
>
<MicrosoftIcon className="w-5 h-5" />
<span>Sign in with Microsoft</span>
</button>
</div>
</>
)}
</div>
</div>
)


@@ -0,0 +1,11 @@
export function MicrosoftIcon({ className = "w-5 h-5" }: { className?: string }) {
return (
<svg className={className} viewBox="0 0 23 23" xmlns="http://www.w3.org/2000/svg">
<path fill="#f3f3f3" d="M0 0h23v23H0z" />
<path fill="#f35325" d="M1 1h10v10H1z" />
<path fill="#81bc06" d="M12 1h10v10H12z" />
<path fill="#05a6f0" d="M1 12h10v10H1z" />
<path fill="#ffba08" d="M12 12h10v10H12z" />
</svg>
)
}


@@ -193,7 +193,7 @@ export function PhaseExperiments({ phase, onBack }: PhaseExperimentsProps) {
<svg className="w-5 h-5 mr-2" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 19l-7-7 7-7" />
</svg>
Back to Phases
Back to Books
</button>
</div>
@@ -203,7 +203,7 @@ export function PhaseExperiments({ phase, onBack }: PhaseExperimentsProps) {
{phase.description && (
<p className="mt-2 text-gray-600">{phase.description}</p>
)}
<p className="mt-2 text-gray-600">Manage experiments within this phase</p>
<p className="mt-2 text-gray-600">Manage experiments within this book</p>
</div>
{canManageExperiments && (
<button
@@ -417,9 +417,9 @@ export function PhaseExperiments({ phase, onBack }: PhaseExperimentsProps) {
<svg className="mx-auto h-12 w-12 text-gray-400" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 5H7a2 2 0 00-2 2v10a2 2 0 002 2h8a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z" />
</svg>
<h3 className="mt-2 text-sm font-medium text-gray-900">No experiments found in this phase</h3>
<h3 className="mt-2 text-sm font-medium text-gray-900">No experiments found in this book</h3>
<p className="mt-1 text-sm text-gray-500">
Get started by creating your first experiment in this phase.
Get started by creating your first experiment in this book.
</p>
{canManageExperiments && (
<div className="mt-6">


@@ -147,7 +147,7 @@ export function PhaseForm({ onSubmit, onCancel, loading = false }: PhaseFormProp
onChange={(e) => handleInputChange('description', e.target.value)}
rows={3}
className="w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
placeholder="Optional description of this experiment phase"
placeholder="Optional description of this experiment book"
disabled={loading}
/>
</div>


@@ -21,7 +21,7 @@ export function PhaseModal({ onClose, onPhaseCreated }: PhaseModalProps) {
onPhaseCreated(newPhase)
onClose()
} catch (err: any) {
setError(err.message || 'Failed to create experiment phase')
setError(err.message || 'Failed to create experiment book')
console.error('Create phase error:', err)
} finally {
setLoading(false)
@@ -35,7 +35,7 @@ export function PhaseModal({ onClose, onPhaseCreated }: PhaseModalProps) {
{/* Header */}
<div className="flex items-center justify-between mb-6">
<h3 className="text-lg font-medium text-gray-900">
Create New Experiment Phase
Create New Experiment Book
</h3>
<button
onClick={onClose}


@@ -7,8 +7,6 @@ interface SidebarProps {
onViewChange: (view: string) => void
isExpanded?: boolean
isMobileOpen?: boolean
isHovered?: boolean
setIsHovered?: (hovered: boolean) => void
}
interface MenuItem {
@@ -23,10 +21,8 @@ export function Sidebar({
user,
currentView,
onViewChange,
isExpanded = true,
isMobileOpen = false,
isHovered = false,
setIsHovered
isExpanded = false,
isMobileOpen = false
}: SidebarProps) {
const [openSubmenu, setOpenSubmenu] = useState<number | null>(null)
const [subMenuHeight, setSubMenuHeight] = useState<Record<string, number>>({})
@@ -170,7 +166,7 @@ export function Sidebar({
className={`menu-item group ${openSubmenu === index
? "menu-item-active"
: "menu-item-inactive"
} cursor-pointer ${!isExpanded && !isHovered
} cursor-pointer ${!isExpanded
? "lg:justify-center"
: "lg:justify-start"
}`}
@@ -183,10 +179,10 @@ export function Sidebar({
>
{nav.icon}
</span>
{(isExpanded || isHovered || isMobileOpen) && (
{(isExpanded || isMobileOpen) && (
<span className="menu-item-text">{nav.name}</span>
)}
{(isExpanded || isHovered || isMobileOpen) && (
{(isExpanded || isMobileOpen) && (
<svg
className={`ml-auto w-5 h-5 transition-transform duration-200 ${openSubmenu === index
? "rotate-180 text-brand-500"
@@ -214,12 +210,12 @@ export function Sidebar({
>
{nav.icon}
</span>
{(isExpanded || isHovered || isMobileOpen) && (
{(isExpanded || isMobileOpen) && (
<span className="menu-item-text">{nav.name}</span>
)}
</button>
)}
{nav.subItems && (isExpanded || isHovered || isMobileOpen) && (
{nav.subItems && (isExpanded || isMobileOpen) && (
<div
ref={(el) => {
subMenuRefs.current[`submenu-${index}`] = el
@@ -265,21 +261,17 @@ export function Sidebar({
className={`fixed mt-16 flex flex-col lg:mt-0 top-0 px-5 left-0 bg-white dark:bg-gray-900 dark:border-gray-800 text-gray-900 h-screen transition-all duration-300 ease-in-out z-50 border-r border-gray-200
${isExpanded || isMobileOpen
? "w-[290px]"
: isHovered
? "w-[290px]"
: "w-[90px]"
: "w-[90px]"
}
${isMobileOpen ? "translate-x-0" : "-translate-x-full"}
lg:translate-x-0`}
onMouseEnter={() => !isExpanded && setIsHovered && setIsHovered(true)}
onMouseLeave={() => setIsHovered && setIsHovered(false)}
>
<div
className={`py-8 flex ${!isExpanded && !isHovered ? "lg:justify-center" : "justify-start"
className={`py-8 flex ${!isExpanded ? "lg:justify-center" : "justify-start"
}`}
>
<div>
{isExpanded || isHovered || isMobileOpen ? (
{isExpanded || isMobileOpen ? (
<>
<h1 className="text-xl font-bold text-gray-800 dark:text-white/90">Pecan Experiments</h1>
<p className="text-sm text-gray-500 dark:text-gray-400">Research Dashboard</p>
@@ -297,12 +289,12 @@ export function Sidebar({
<div className="flex flex-col gap-4">
<div>
<h2
className={`mb-4 text-xs uppercase flex leading-[20px] text-gray-400 ${!isExpanded && !isHovered
className={`mb-4 text-xs uppercase flex leading-[20px] text-gray-400 ${!isExpanded
? "lg:justify-center"
: "justify-start"
}`}
>
{isExpanded || isHovered || isMobileOpen ? (
{isExpanded || isMobileOpen ? (
"Menu"
) : (
<svg className="size-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">


@@ -60,6 +60,8 @@ export interface Experiment {
airdrying_id?: string | null
cracking_id?: string | null
shelling_id?: string | null
ring_gap_inches?: number | null
drum_rpm?: number | null
created_at: string
updated_at: string
created_by: string
@@ -170,6 +172,51 @@ export interface UpdateExperimentPhaseRequest {
has_shelling?: boolean
}
// Experiment-level phase config (one row per experiment per phase; stored in experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling)
export interface ExperimentSoakingConfig {
id: string
experiment_id: string
soaking_duration_hr: number
created_at: string
updated_at: string
created_by: string
}
export interface ExperimentAirdryingConfig {
id: string
experiment_id: string
duration_minutes: number
created_at: string
updated_at: string
created_by: string
}
export interface ExperimentCrackingConfig {
id: string
experiment_id: string
machine_type_id: string
plate_contact_frequency_hz?: number | null
throughput_rate_pecans_sec?: number | null
crush_amount_in?: number | null
entry_exit_height_diff_in?: number | null
motor_speed_hz?: number | null
jig_displacement_inches?: number | null
spring_stiffness_nm?: number | null
created_at: string
updated_at: string
created_by: string
}
export interface ExperimentShellingConfig {
id: string
experiment_id: string
ring_gap_inches?: number | null
drum_rpm?: number | null
created_at: string
updated_at: string
created_by: string
}
export interface CreateExperimentRequest {
experiment_number: number
reps_required: number
@@ -177,6 +224,19 @@ export interface CreateExperimentRequest {
results_status?: ResultsStatus
completion_status?: boolean
phase_id?: string
// Phase config (stored in experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling)
soaking_duration_hr?: number
air_drying_time_min?: number
// Cracking: machine_type comes from book; params below are JC or Meyer specific
plate_contact_frequency_hz?: number
throughput_rate_pecans_sec?: number
crush_amount_in?: number
entry_exit_height_diff_in?: number
motor_speed_hz?: number
jig_displacement_inches?: number
spring_stiffness_nm?: number
ring_gap_inches?: number | null
drum_rpm?: number | null
}
export interface UpdateExperimentRequest {
@@ -186,6 +246,17 @@ export interface UpdateExperimentRequest {
results_status?: ResultsStatus
completion_status?: boolean
phase_id?: string
soaking_duration_hr?: number
air_drying_time_min?: number
plate_contact_frequency_hz?: number
throughput_rate_pecans_sec?: number
crush_amount_in?: number
entry_exit_height_diff_in?: number
motor_speed_hz?: number
jig_displacement_inches?: number
spring_stiffness_nm?: number
ring_gap_inches?: number | null
drum_rpm?: number | null
}
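The flat request shapes above fan out into per-phase config tables on the server side. A hedged, illustrative example of a `CreateExperimentRequest` for a book with cracking and shelling enabled — every value (and the id) is hypothetical:

```typescript
// Illustrative only: field names come from the interfaces above,
// the values and the phase_id are made up.
const req = {
  experiment_number: 12,
  reps_required: 3,
  weight_per_repetition_lbs: 5,
  phase_id: "00000000-0000-0000-0000-000000000000", // hypothetical book id
  motor_speed_hz: 42.5,   // cracking parameter (routed to experiment_cracking)
  ring_gap_inches: 0.25,  // shelling parameter (routed to experiment_shelling)
  drum_rpm: 180,          // shelling parameter
};
```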
export interface CreateRepetitionRequest {
@@ -557,15 +628,69 @@ export const userManagement = {
if (error) throw error
return data
},
// Sync OAuth user - ensures user profile exists for OAuth-authenticated users
async syncOAuthUser(): Promise<void> {
try {
const { data: { user: authUser }, error: authError } = await supabase.auth.getUser()
if (authError || !authUser) {
console.warn('No authenticated user found for OAuth sync')
return
}
// Check if user profile already exists
const { data: existingProfile, error: checkError } = await supabase
.from('user_profiles')
.select('id')
.eq('id', authUser.id)
.single()
// If profile already exists, no need to create it
if (existingProfile && !checkError) {
console.log('User profile already exists for user:', authUser.id)
return
}
// If error is not "no rows returned", it's a real error
if (checkError && checkError.code !== 'PGRST116') {
console.error('Error checking for existing profile:', checkError)
return
}
// Create user profile for new OAuth user
const { error: insertError } = await supabase
.from('user_profiles')
.insert({
id: authUser.id,
email: authUser.email || '',
status: 'active'
})
if (insertError) {
// Ignore "duplicate key value" errors in case of race condition
if (insertError.code === '23505') {
console.log('User profile was already created (race condition handled)')
return
}
console.error('Error creating user profile for OAuth user:', insertError)
return
}
console.log('Successfully created user profile for OAuth user:', authUser.id)
} catch (error) {
console.error('Unexpected error in syncOAuthUser:', error)
}
}
}
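The branching in `syncOAuthUser` above can be summarized as pure decision functions over the Supabase/PostgREST error codes it checks (`PGRST116` for "no rows", `23505` for unique_violation). A hedged sketch — these helpers are not part of the codebase, just a model of the control flow:

```typescript
// Sketch of syncOAuthUser's decision logic; error shapes are simplified.
type SyncAction = "skip" | "create" | "abort";

function profileSyncAction(
  profileExists: boolean,
  checkErrorCode?: string
): SyncAction {
  if (profileExists) return "skip"; // profile already present, nothing to do
  // Any error other than "no rows returned" is a real failure.
  if (checkErrorCode && checkErrorCode !== "PGRST116") return "abort";
  return "create"; // no row found -> insert a new profile
}

function insertOutcome(insertErrorCode?: string): "created" | "raced" | "failed" {
  if (!insertErrorCode) return "created";
  if (insertErrorCode === "23505") return "raced"; // duplicate key: concurrent request won
  return "failed";
}
```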
// Experiment phase management utility functions
// Experiment book management (table: experiment_books)
export const experimentPhaseManagement = {
// Get all experiment phases
// Get all experiment books
async getAllExperimentPhases(): Promise<ExperimentPhase[]> {
const { data, error } = await supabase
.from('experiment_phases')
.from('experiment_books')
.select('*')
.order('created_at', { ascending: false })
@@ -573,10 +698,10 @@ export const experimentPhaseManagement = {
return data
},
// Get experiment phase by ID
// Get experiment book by ID
async getExperimentPhaseById(id: string): Promise<ExperimentPhase | null> {
const { data, error } = await supabase
.from('experiment_phases')
.from('experiment_books')
.select('*')
.eq('id', id)
.single()
@@ -588,13 +713,13 @@ export const experimentPhaseManagement = {
return data
},
// Create a new experiment phase
// Create a new experiment book
async createExperimentPhase(phaseData: CreateExperimentPhaseRequest): Promise<ExperimentPhase> {
const { data: { user }, error: authError } = await supabase.auth.getUser()
if (authError || !user) throw new Error('User not authenticated')
const { data, error } = await supabase
.from('experiment_phases')
.from('experiment_books')
.insert({
...phaseData,
created_by: user.id
@@ -606,10 +731,10 @@ export const experimentPhaseManagement = {
return data
},
// Update an experiment phase
// Update an experiment book
async updateExperimentPhase(id: string, updates: UpdateExperimentPhaseRequest): Promise<ExperimentPhase> {
const { data, error } = await supabase
.from('experiment_phases')
.from('experiment_books')
.update(updates)
.eq('id', id)
.select()
@@ -619,10 +744,10 @@ export const experimentPhaseManagement = {
return data
},
// Delete an experiment phase
// Delete an experiment book
async deleteExperimentPhase(id: string): Promise<void> {
const { error } = await supabase
.from('experiment_phases')
.from('experiment_books')
.delete()
.eq('id', id)
@@ -670,33 +795,170 @@ export const experimentManagement = {
return data
},
// Create a new experiment
// Get experiment with its phase config (soaking, airdrying, cracking, shelling) for edit form
async getExperimentWithPhaseConfig(id: string): Promise<(Experiment & {
soaking?: ExperimentSoakingConfig | null
airdrying?: ExperimentAirdryingConfig | null
cracking?: ExperimentCrackingConfig | null
shelling?: ExperimentShellingConfig | null
}) | null> {
const experiment = await this.getExperimentById(id)
if (!experiment) return null
const [soakingRes, airdryingRes, crackingRes, shellingRes] = await Promise.all([
supabase.from('experiment_soaking').select('*').eq('experiment_id', id).maybeSingle(),
supabase.from('experiment_airdrying').select('*').eq('experiment_id', id).maybeSingle(),
supabase.from('experiment_cracking').select('*').eq('experiment_id', id).maybeSingle(),
supabase.from('experiment_shelling').select('*').eq('experiment_id', id).maybeSingle()
])
if (soakingRes.error) throw soakingRes.error
if (airdryingRes.error) throw airdryingRes.error
if (crackingRes.error) throw crackingRes.error
if (shellingRes.error) throw shellingRes.error
return {
...experiment,
soaking: soakingRes.data ?? null,
airdrying: airdryingRes.data ?? null,
cracking: crackingRes.data ?? null,
shelling: shellingRes.data ?? null
}
},
// Create a new experiment and its phase config rows (experiment_soaking, experiment_airdrying, experiment_cracking, experiment_shelling)
async createExperiment(experimentData: CreateExperimentRequest): Promise<Experiment> {
const { data: { user }, error: authError } = await supabase.auth.getUser()
if (authError || !user) throw new Error('User not authenticated')
const { data, error } = await supabase
const phaseId = experimentData.phase_id
const corePayload = {
experiment_number: experimentData.experiment_number,
reps_required: experimentData.reps_required,
weight_per_repetition_lbs: experimentData.weight_per_repetition_lbs,
results_status: experimentData.results_status ?? 'valid',
completion_status: experimentData.completion_status ?? false,
phase_id: phaseId,
created_by: user.id
}
// phase_id required for phase configs
if (!phaseId) {
const { data, error } = await supabase.from('experiments').insert(corePayload).select().single()
if (error) throw error
return data
}
const { data: experiment, error } = await supabase
.from('experiments')
.insert({
...experimentData,
created_by: user.id
})
.insert(corePayload)
.select()
.single()
if (error) throw error
return data
const book = await experimentPhaseManagement.getExperimentPhaseById(phaseId)
if (!book) return experiment
if (book.has_soaking && experimentData.soaking_duration_hr != null) {
await supabase.from('experiment_soaking').insert({
experiment_id: experiment.id,
soaking_duration_hr: experimentData.soaking_duration_hr,
created_by: user.id
})
}
if (book.has_airdrying && experimentData.air_drying_time_min != null) {
await supabase.from('experiment_airdrying').insert({
experiment_id: experiment.id,
duration_minutes: experimentData.air_drying_time_min,
created_by: user.id
})
}
if (book.has_cracking && book.cracking_machine_type_id) {
const crackPayload: Record<string, unknown> = {
experiment_id: experiment.id,
machine_type_id: book.cracking_machine_type_id,
created_by: user.id
}
if (experimentData.plate_contact_frequency_hz != null) crackPayload.plate_contact_frequency_hz = experimentData.plate_contact_frequency_hz
if (experimentData.throughput_rate_pecans_sec != null) crackPayload.throughput_rate_pecans_sec = experimentData.throughput_rate_pecans_sec
if (experimentData.crush_amount_in != null) crackPayload.crush_amount_in = experimentData.crush_amount_in
if (experimentData.entry_exit_height_diff_in != null) crackPayload.entry_exit_height_diff_in = experimentData.entry_exit_height_diff_in
if (experimentData.motor_speed_hz != null) crackPayload.motor_speed_hz = experimentData.motor_speed_hz
if (experimentData.jig_displacement_inches != null) crackPayload.jig_displacement_inches = experimentData.jig_displacement_inches
if (experimentData.spring_stiffness_nm != null) crackPayload.spring_stiffness_nm = experimentData.spring_stiffness_nm
await supabase.from('experiment_cracking').insert(crackPayload)
}
if (book.has_shelling && (experimentData.ring_gap_inches != null || experimentData.drum_rpm != null)) {
await supabase.from('experiment_shelling').insert({
experiment_id: experiment.id,
ring_gap_inches: experimentData.ring_gap_inches ?? null,
drum_rpm: experimentData.drum_rpm ?? null,
created_by: user.id
})
}
return experiment
},
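`createExperiment` builds the `experiment_cracking` payload by repeating `if (x != null) payload.x = x` per field. A hedged sketch of a generic helper that collapses that pattern — `pruneNullish` is hypothetical, not in the codebase:

```typescript
// Drops null/undefined entries so only provided parameters reach the insert.
function pruneNullish<T extends Record<string, unknown>>(
  obj: T
): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [k, v] of Object.entries(obj)) {
    if (v != null) out[k] = v; // `!= null` skips both null and undefined
  }
  return out;
}
```

With such a helper, the cracking branch could reduce to one `pruneNullish({...})` call spread into the base payload.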
// Update an experiment
// Update an experiment and upsert its phase config rows
async updateExperiment(id: string, updates: UpdateExperimentRequest): Promise<Experiment> {
const { data, error } = await supabase
.from('experiments')
.update(updates)
.eq('id', id)
.select()
.single()
const { data: { user }, error: authError } = await supabase.auth.getUser()
if (authError || !user) throw new Error('User not authenticated')
const coreKeys = ['experiment_number', 'reps_required', 'weight_per_repetition_lbs', 'results_status', 'completion_status', 'phase_id'] as const
const coreUpdates: Partial<UpdateExperimentRequest> = {}
for (const k of coreKeys) {
if (updates[k] !== undefined) coreUpdates[k] = updates[k]
}
if (Object.keys(coreUpdates).length > 0) {
const { data, error } = await supabase.from('experiments').update(coreUpdates).eq('id', id).select().single()
if (error) throw error
}
if (updates.soaking_duration_hr !== undefined) {
const { data: existing } = await supabase.from('experiment_soaking').select('id').eq('experiment_id', id).maybeSingle()
if (existing) {
await supabase.from('experiment_soaking').update({ soaking_duration_hr: updates.soaking_duration_hr, updated_at: new Date().toISOString() }).eq('experiment_id', id)
} else {
await supabase.from('experiment_soaking').insert({ experiment_id: id, soaking_duration_hr: updates.soaking_duration_hr, created_by: user.id })
}
}
if (updates.air_drying_time_min !== undefined) {
const { data: existing } = await supabase.from('experiment_airdrying').select('id').eq('experiment_id', id).maybeSingle()
if (existing) {
await supabase.from('experiment_airdrying').update({ duration_minutes: updates.air_drying_time_min, updated_at: new Date().toISOString() }).eq('experiment_id', id)
} else {
await supabase.from('experiment_airdrying').insert({ experiment_id: id, duration_minutes: updates.air_drying_time_min, created_by: user.id })
}
}
const crackKeys = ['plate_contact_frequency_hz', 'throughput_rate_pecans_sec', 'crush_amount_in', 'entry_exit_height_diff_in', 'motor_speed_hz', 'jig_displacement_inches', 'spring_stiffness_nm'] as const
const hasCrackUpdates = crackKeys.some(k => updates[k] !== undefined)
if (hasCrackUpdates) {
const { data: existing } = await supabase.from('experiment_cracking').select('id').eq('experiment_id', id).maybeSingle()
const crackPayload: Record<string, unknown> = {}
crackKeys.forEach(k => { if (updates[k] !== undefined) crackPayload[k] = updates[k] })
if (Object.keys(crackPayload).length > 0) {
if (existing) {
await supabase.from('experiment_cracking').update({ ...crackPayload, updated_at: new Date().toISOString() }).eq('experiment_id', id)
} else {
const exp = await this.getExperimentById(id)
const book = exp?.phase_id ? await experimentPhaseManagement.getExperimentPhaseById(exp.phase_id) : null
if (book?.has_cracking && book.cracking_machine_type_id) {
await supabase.from('experiment_cracking').insert({ experiment_id: id, machine_type_id: book.cracking_machine_type_id, ...crackPayload, created_by: user.id })
}
}
}
}
if (updates.ring_gap_inches !== undefined || updates.drum_rpm !== undefined) {
const { data: existing } = await supabase.from('experiment_shelling').select('id').eq('experiment_id', id).maybeSingle()
const shellPayload = { ring_gap_inches: updates.ring_gap_inches ?? null, drum_rpm: updates.drum_rpm ?? null }
if (existing) {
await supabase.from('experiment_shelling').update({ ...shellPayload, updated_at: new Date().toISOString() }).eq('experiment_id', id)
} else {
await supabase.from('experiment_shelling').insert({ experiment_id: id, ...shellPayload, created_by: user.id })
}
}
const { data, error } = await supabase.from('experiments').select('*').eq('id', id).single()
if (error) throw error
return data
},
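`updateExperiment` above partitions the incoming fields into core experiment columns and per-phase config values. The same routing can be sketched as a pure function (key names are taken from the code above; the helper itself is hypothetical):

```typescript
// Core columns live on the experiments table; everything else is routed
// to the per-phase config tables (soaking, airdrying, cracking, shelling).
const CORE_KEYS = new Set([
  "experiment_number", "reps_required", "weight_per_repetition_lbs",
  "results_status", "completion_status", "phase_id",
]);

function splitUpdates(updates: Record<string, unknown>) {
  const core: Record<string, unknown> = {};
  const phaseConfig: Record<string, unknown> = {};
  for (const [k, v] of Object.entries(updates)) {
    if (v === undefined) continue; // untouched fields are skipped entirely
    (CORE_KEYS.has(k) ? core : phaseConfig)[k] = v;
  }
  return { core, phaseConfig };
}
```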
@@ -739,13 +1001,16 @@ export const experimentManagement = {
// Check if experiment number is unique
async isExperimentNumberUnique(experimentNumber: number, excludeId?: string): Promise<boolean> {
// Check if experiment number is unique within the same phase (experiment_number + phase_id must be unique)
async isExperimentNumberUnique(experimentNumber: number, phaseId?: string, excludeId?: string): Promise<boolean> {
let query = supabase
.from('experiments')
.select('id')
.eq('experiment_number', experimentNumber)
if (phaseId) {
query = query.eq('phase_id', phaseId)
}
if (excludeId) {
query = query.neq('id', excludeId)
}
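The query above enforces that `experiment_number` is unique within a phase (book) rather than globally; omitting `phaseId` falls back to a global check. An in-memory model of the same rule — a hedged sketch, not the production query path:

```typescript
// Minimal model of the (experiment_number, phase_id) uniqueness rule.
interface ExpRow { id: string; experiment_number: number; phase_id?: string }

function isNumberUnique(
  rows: ExpRow[],
  num: number,
  phaseId?: string,
  excludeId?: string
): boolean {
  return !rows.some(r =>
    r.experiment_number === num &&
    (phaseId === undefined || r.phase_id === phaseId) && // no phaseId -> global check
    r.id !== excludeId // skip the row being edited
  );
}
```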


@@ -4,6 +4,7 @@ interface ImportMetaEnv {
readonly VITE_SUPABASE_URL: string;
readonly VITE_SUPABASE_ANON_KEY: string;
readonly VITE_VISION_API_URL?: string; // optional; defaults to "/api" via vite proxy
readonly VITE_ENABLE_MICROSOFT_LOGIN?: string; // optional; enable Microsoft Entra authentication
}
interface ImportMeta {
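Since Vite exposes env vars as strings, the `VITE_ENABLE_MICROSOFT_LOGIN` flag declared above needs explicit parsing before use. A minimal sketch — the set of accepted truthy spellings is an assumption, not documented Vite behavior:

```typescript
// Parses a string env flag into a boolean; undefined falls back to a default.
function parseBoolFlag(value: string | undefined, fallback = false): boolean {
  if (value === undefined) return fallback;
  return ["1", "true", "yes", "on"].includes(value.trim().toLowerCase());
}
```

Call sites would then read, e.g., `parseBoolFlag(import.meta.env.VITE_ENABLE_MICROSOFT_LOGIN)`.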


@@ -0,0 +1,46 @@
-- OAuth User Synchronization
-- This migration adds functionality to automatically create user profiles when users sign up via OAuth
-- =============================================
-- 1. CREATE FUNCTION FOR OAUTH USER AUTO-PROFILE CREATION
-- =============================================
CREATE OR REPLACE FUNCTION public.handle_new_oauth_user()
RETURNS TRIGGER AS $$
BEGIN
-- Check if user profile already exists
IF NOT EXISTS (
SELECT 1 FROM public.user_profiles WHERE id = NEW.id
) THEN
-- Create user profile with default active status
INSERT INTO public.user_profiles (id, email, status)
VALUES (
NEW.id,
NEW.email,
'active'
)
ON CONFLICT (id) DO NOTHING;
END IF;
RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
-- =============================================
-- 2. CREATE TRIGGER FOR NEW AUTH USERS
-- =============================================
-- Drop the trigger if it exists to avoid conflicts
DROP TRIGGER IF EXISTS on_auth_user_created ON auth.users;
-- Create trigger that fires after a new user is created in auth.users
CREATE TRIGGER on_auth_user_created
AFTER INSERT ON auth.users
FOR EACH ROW EXECUTE FUNCTION public.handle_new_oauth_user();
-- =============================================
-- 3. COMMENT FOR DOCUMENTATION
-- =============================================
COMMENT ON FUNCTION public.handle_new_oauth_user() IS
'Automatically creates a user profile in public.user_profiles when a new user is created via OAuth in auth.users. This ensures OAuth users are immediately accessible in the application without manual provisioning.';

module.nix (new file)

@@ -0,0 +1,179 @@
{ config, lib, pkgs, ... }:
let
cfg = config.services.usda-vision;
# Get packages from the package option (must be provided)
camera-sdk = cfg.package.camera-sdk;
usda-vision-app = cfg.package.usda-vision;
in
{
options.services.usda-vision = {
enable = lib.mkEnableOption "USDA Vision system";
package = lib.mkOption {
type = lib.types.attrs;
default = {};
      description = "Package set containing camera-sdk and usda-vision packages (must be provided; the empty default fails at evaluation)";
};
hostname = lib.mkOption {
type = lib.types.str;
default = "exp-dash";
      description = "Hostname or IP address that replaces exp-dash and localhost in the generated configuration";
};
replaceHostnames = lib.mkOption {
type = lib.types.bool;
default = false;
description = "Whether to replace exp-dash and localhost hostnames with the configured hostname";
};
envFile = lib.mkOption {
type = lib.types.nullOr lib.types.path;
default = null;
description = "Path to environment file (managed by ragenix in deployment)";
};
};
config = lib.mkIf cfg.enable {
# System packages
environment.systemPackages = with pkgs; [
docker
docker-compose
supabase-cli
camera-sdk
usda-vision-app
];
# Make camera SDK libraries available system-wide
environment.variables = {
LD_LIBRARY_PATH = "${camera-sdk}/lib";
};
# Enable Docker service
virtualisation.docker = {
enable = true;
autoPrune.enable = true;
daemon.settings = {
experimental = true;
};
};
# Create persistent directories
systemd.tmpfiles.rules = [
"d /var/lib/usda-vision 0755 root root -"
"f /var/lib/usda-vision/.env 0644 root root -"
"d /var/lib/supabase 0755 root root -"
];
# Supabase CLI service
systemd.services.supabase-cli = {
enable = true;
description = "Supabase CLI Service";
preStart = ''
rm -rf /var/lib/supabase/*
rm -rf /var/lib/supabase/.* 2>/dev/null || true
if [ -d ${usda-vision-app}/opt/usda-vision/supabase ]; then
${pkgs.rsync}/bin/rsync -av ${usda-vision-app}/opt/usda-vision/supabase/ /var/lib/supabase/supabase/
fi
mkdir -p /var/lib/supabase/supabase/.branches
chmod -R 755 /var/lib/supabase
'';
serviceConfig = {
WorkingDirectory = "/var/lib/supabase";
EnvironmentFile = lib.mkIf (cfg.envFile != null) cfg.envFile;
ExecStart = "${pkgs.supabase-cli}/bin/supabase start";
ExecStop = "${pkgs.supabase-cli}/bin/supabase stop";
Type = "oneshot";
RemainAfterExit = true;
User = "root";
Group = "root";
};
};
# USDA Vision docker compose service
systemd.services.usda-vision = {
description = "USDA Vision Docker Compose Stack";
after = [ "docker.service" "network-online.target" "systemd-tmpfiles-setup.service" ];
wants = [ "network-online.target" ];
wantedBy = [ "multi-user.target" ];
unitConfig = lib.mkIf (cfg.envFile != null) {
ConditionPathExists = cfg.envFile;
};
preStart = ''
echo "Syncing application code to /var/lib/usda-vision..."
${pkgs.rsync}/bin/rsync -av --delete \
--checksum \
--exclude='node_modules' \
--exclude='.env' \
--exclude='.env.azure' \
--exclude='__pycache__' \
--exclude='.venv' \
${usda-vision-app}/opt/usda-vision/ /var/lib/usda-vision/
${lib.optionalString cfg.replaceHostnames ''
echo "Replacing hostnames (exp-dash, localhost) with ${cfg.hostname} in docker-compose.yml..."
${pkgs.gnused}/bin/sed -i \
-e 's|exp-dash|${cfg.hostname}|g' \
-e 's|localhost|${cfg.hostname}|g' \
/var/lib/usda-vision/docker-compose.yml
''}
# Configure docker-compose to use Nix-provided camera SDK
echo "Configuring camera SDK in docker-compose.yml..."
${pkgs.gnused}/bin/sed -i \
-e '/^ - \/etc\/timezone:\/etc\/timezone:ro$/a\ - ${camera-sdk}/lib:/opt/camera-sdk/lib:ro' \
-e 's|LD_LIBRARY_PATH=/usr/local/lib:/lib:/usr/lib|LD_LIBRARY_PATH=/opt/camera-sdk/lib:/usr/local/lib:/lib:/usr/lib|' \
/var/lib/usda-vision/docker-compose.yml
# Fix env_file paths to point to /var/lib/usda-vision/.env
echo "Fixing env_file paths in docker-compose.yml..."
${pkgs.gnused}/bin/sed -i \
's|/var/lib/usda-vision/management-dashboard-web-app/\.env|/var/lib/usda-vision/.env|g' \
/var/lib/usda-vision/docker-compose.yml
# Replace [REDACTED] placeholders with actual VITE_SUPABASE_ANON_KEY reference
echo "Fixing ANON_KEY references in docker-compose.yml..."
${pkgs.gnused}/bin/sed -i \
's|\[REDACTED\]|$${VITE_SUPABASE_ANON_KEY}|g' \
/var/lib/usda-vision/docker-compose.yml
      # Null-safe: interpolating a null envFile path would fail at Nix eval time.
      ${lib.optionalString (cfg.envFile != null) ''
        echo "Copying environment file from managed secret..."
        cp ${cfg.envFile} /var/lib/usda-vision/.env
        chmod 644 /var/lib/usda-vision/.env
      ''}
${lib.optionalString (cfg.envFile == null) ''
if [ ! -s /var/lib/usda-vision/.env ]; then
if [ -f ${usda-vision-app}/opt/usda-vision/.env.example ]; then
echo "WARNING: No environment file provided, using .env.example"
cp ${usda-vision-app}/opt/usda-vision/.env.example /var/lib/usda-vision/.env
fi
fi
''}
'';
serviceConfig = {
Type = "oneshot";
RemainAfterExit = true;
WorkingDirectory = "/var/lib/usda-vision";
User = "root";
Group = "root";
ExecStart = "${pkgs.docker-compose}/bin/docker-compose -f /var/lib/usda-vision/docker-compose.yml up -d --build";
ExecStop = "${pkgs.docker-compose}/bin/docker-compose -f /var/lib/usda-vision/docker-compose.yml down";
ExecReload = "${pkgs.bash}/bin/bash -c '${pkgs.docker-compose}/bin/docker-compose -f /var/lib/usda-vision/docker-compose.yml down && ${pkgs.docker-compose}/bin/docker-compose -f /var/lib/usda-vision/docker-compose.yml up -d --build'";
TimeoutStartSec = 300;
TimeoutStopSec = 120;
};
};
};
}

package-lock.json (generated, new file)

@@ -0,0 +1,61 @@
{
"name": "usda-vision",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"dependencies": {
"react": "^19.2.3",
"react-dom": "^19.2.3"
},
"devDependencies": {
"@types/react": "^19.2.7"
}
},
"node_modules/@types/react": {
"version": "19.2.7",
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.7.tgz",
"integrity": "sha512-MWtvHrGZLFttgeEj28VXHxpmwYbor/ATPYbBfSFZEIRK0ecCFLl2Qo55z52Hss+UV9CRN7trSeq1zbgx7YDWWg==",
"dev": true,
"license": "MIT",
"dependencies": {
"csstype": "^3.2.2"
}
},
"node_modules/csstype": {
"version": "3.2.3",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz",
"integrity": "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ==",
"dev": true,
"license": "MIT"
},
"node_modules/react": {
"version": "19.2.3",
"resolved": "https://registry.npmjs.org/react/-/react-19.2.3.tgz",
"integrity": "sha512-Ku/hhYbVjOQnXDZFv2+RibmLFGwFdeeKHFcOTlrt7xplBnya5OGn/hIRDsqDiSUcfORsDC7MPxwork8jBwsIWA==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/react-dom": {
"version": "19.2.3",
"resolved": "https://registry.npmjs.org/react-dom/-/react-dom-19.2.3.tgz",
"integrity": "sha512-yELu4WmLPw5Mr/lmeEpox5rw3RETacE++JgHqQzd2dg+YbJuat3jH4ingc+WPZhxaoFzdv9y33G+F7Nl5O0GBg==",
"license": "MIT",
"dependencies": {
"scheduler": "^0.27.0"
},
"peerDependencies": {
"react": "^19.2.3"
}
},
"node_modules/scheduler": {
"version": "0.27.0",
"resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.27.0.tgz",
"integrity": "sha512-eNv+WrVbKu1f3vbYJT/xtiF5syA5HPIMtf9IgY/nKg0sWqzAUEvqY/xm7OcZc/qafLx/iO9FgOmeSAp4v5ti/Q==",
"license": "MIT"
}
}
}

package.json (new file)

@@ -0,0 +1,9 @@
{
"dependencies": {
"react": "^19.2.3",
"react-dom": "^19.2.3"
},
"devDependencies": {
"@types/react": "^19.2.7"
}
}

package.nix (new file)

@@ -0,0 +1,129 @@
{ lib
, stdenv
, makeWrapper
, rsync
, gnused
, gawk
, docker-compose
}:
stdenv.mkDerivation {
pname = "usda-vision";
version = "1.0.0";
# Use the directory from this repository with explicit source filtering
src = lib.cleanSourceWith {
src = ./.;
filter = path: type:
let
baseName = baseNameOf path;
in
# Exclude git, but include everything else
baseName != ".git" &&
baseName != ".cursor" &&
baseName != "__pycache__" &&
baseName != "node_modules" &&
baseName != ".venv" &&
baseName != ".age" &&
baseName != "flake.nix" &&
baseName != "flake.lock" &&
baseName != "package.nix" &&
baseName != "camera-sdk.nix";
};
nativeBuildInputs = [ makeWrapper rsync ];
# Don't run these phases, we'll do everything in installPhase
dontBuild = true;
dontConfigure = true;
installPhase = ''
mkdir -p $out/opt/usda-vision
# Debug: show what's in source
echo "Source directory contents:"
ls -la $src/ || true
# Process docker-compose.yml - replace paths and configure SDK from Nix
if [ -f $src/docker-compose.yml ]; then
# Basic path replacements with sed
${gnused}/bin/sed \
-e 's|\./management-dashboard-web-app|/var/lib/usda-vision/management-dashboard-web-app|g' \
-e 's|\./media-api|/var/lib/usda-vision/media-api|g' \
-e 's|\./video-remote|/var/lib/usda-vision/video-remote|g' \
-e 's|\./scheduling-remote|/var/lib/usda-vision/scheduling-remote|g' \
-e 's|\./vision-system-remote|/var/lib/usda-vision/vision-system-remote|g' \
-e 's|\./camera-management-api|/var/lib/usda-vision/camera-management-api|g' \
$src/docker-compose.yml > $TMPDIR/docker-compose-step1.yml
# Remove SDK installation blocks using awk for better multi-line handling
${gawk}/bin/awk '
/# Only install system packages if not already installed/ { skip=1 }
skip && /^ fi$/ { skip=0; next }
/# Install camera SDK if not already installed/ { skip_sdk=1 }
skip_sdk && /^ fi;$/ { skip_sdk=0; next }
!skip && !skip_sdk { print }
' $TMPDIR/docker-compose-step1.yml > $TMPDIR/docker-compose.yml
rm -f $TMPDIR/docker-compose-step1.yml
fi
# Copy all application files using rsync with chmod
${rsync}/bin/rsync -av \
--chmod=Du+w \
--exclude='.git' \
--exclude='docker-compose.yml' \
--exclude='.env' \
--exclude='*.age' \
--exclude='flake.nix' \
--exclude='flake.lock' \
--exclude='package.nix' \
--exclude='camera-sdk.nix' \
--exclude='management-dashboard-web-app/.env' \
$src/ $out/opt/usda-vision/
# Copy the processed docker-compose.yml
if [ -f $TMPDIR/docker-compose.yml ]; then
cp $TMPDIR/docker-compose.yml $out/opt/usda-vision/docker-compose.yml
fi
# Verify files were copied
echo "Destination directory contents:"
ls -la $out/opt/usda-vision/ || true
# Create convenience scripts
mkdir -p $out/bin
# Unquoted heredoc delimiters so $out and the docker-compose store path
# expand at build time; \$@ is escaped so it survives into the runtime script.
cat > $out/bin/usda-vision-start <<EOF
#!/usr/bin/env bash
cd $out/opt/usda-vision
${docker-compose}/bin/docker-compose up -d --build
EOF
cat > $out/bin/usda-vision-stop <<EOF
#!/usr/bin/env bash
cd $out/opt/usda-vision
${docker-compose}/bin/docker-compose down
EOF
cat > $out/bin/usda-vision-logs <<EOF
#!/usr/bin/env bash
cd $out/opt/usda-vision
${docker-compose}/bin/docker-compose logs -f "\$@"
EOF
cat > $out/bin/usda-vision-restart <<EOF
#!/usr/bin/env bash
cd $out/opt/usda-vision
${docker-compose}/bin/docker-compose restart "\$@"
EOF
chmod +x $out/bin/usda-vision-*
'';
meta = with lib; {
description = "USDA Vision camera management system";
maintainers = [ "UGA Innovation Factory" ];
platforms = platforms.linux;
};
}


@@ -9,7 +9,7 @@
"build:watch": "vite build --watch",
"serve:dist": "serve -s dist -l 3003",
"preview": "vite preview --port 3003",
"dev:watch": "npm run build && (npm run build:watch &) && sleep 1 && npx http-server dist -p 3003 --cors -c-1"
"dev:watch": "./wait-and-serve.sh"
},
"dependencies": {
"@supabase/supabase-js": "^2.52.0",

File diff suppressed because it is too large.


@@ -70,8 +70,6 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
// Track repetitions that have been dropped/moved and should show time points
const [repetitionsWithTimes, setRepetitionsWithTimes] = useState<Set<string>>(new Set())
- // Track which repetitions are locked (prevent dragging)
- const [lockedSchedules, setLockedSchedules] = useState<Set<string>>(new Set())
// Track which repetitions are currently being scheduled
const [schedulingRepetitions, setSchedulingRepetitions] = useState<Set<string>>(new Set())
// Track conductor assignments for each phase marker (markerId -> conductorIds[])
@@ -253,44 +251,22 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
}
const toggleRepetition = (repId: string) => {
+ // Checking/unchecking should only control visibility on the timeline.
+ // It must NOT clear scheduling info or conductor assignments.
setSelectedRepetitionIds(prev => {
const next = new Set(prev)
if (next.has(repId)) {
+ // Hide this repetition from the timeline
next.delete(repId)
- // Remove from scheduled repetitions when unchecked
- setScheduledRepetitions(prevScheduled => {
- const newScheduled = { ...prevScheduled }
- delete newScheduled[repId]
- return newScheduled
- })
- // Clear all related state when unchecked
- setRepetitionsWithTimes(prev => {
- const next = new Set(prev)
- next.delete(repId)
- return next
- })
- setLockedSchedules(prev => {
- const next = new Set(prev)
- next.delete(repId)
- return next
- })
- setSchedulingRepetitions(prev => {
- const next = new Set(prev)
- next.delete(repId)
- return next
- })
- // Re-stagger remaining repetitions
- const remainingIds = Array.from(next).filter(id => id !== repId)
- if (remainingIds.length > 0) {
- reStaggerRepetitions(remainingIds)
- }
+ // Keep scheduledRepetitions and repetitionsWithTimes intact so that
+ // re-checking the box restores the repetition in the correct spot.
} else {
+ // Show this repetition on the timeline
next.add(repId)
+ // Auto-spawn when checked - pass the updated set to ensure correct stagger calculation
+ // spawnSingleRepetition will position the new repetition relative to existing ones
+ // without resetting existing positions
spawnSingleRepetition(repId, next)
- // Re-stagger all existing repetitions to prevent overlap
- // Note: reStaggerRepetitions will automatically skip locked repetitions
- reStaggerRepetitions([...next, repId])
}
return next
})
@@ -305,20 +281,14 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
const allSelected = allRepetitions.every(rep => prev.has(rep.id))
if (allSelected) {
- // Deselect all repetitions in this phase
+ // Deselect all repetitions in this phase (hide from timeline only)
const next = new Set(prev)
allRepetitions.forEach(rep => {
next.delete(rep.id)
// Remove from scheduled repetitions
setScheduledRepetitions(prevScheduled => {
const newScheduled = { ...prevScheduled }
delete newScheduled[rep.id]
return newScheduled
})
})
return next
} else {
- // Select all repetitions in this phase
+ // Select all repetitions in this phase (show on timeline)
const next = new Set(prev)
allRepetitions.forEach(rep => {
next.add(rep.id)
@@ -356,7 +326,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
// Re-stagger all repetitions to prevent overlap
// IMPORTANT: Skip locked repetitions to prevent them from moving
- const reStaggerRepetitions = useCallback((repIds: string[]) => {
+ const reStaggerRepetitions = useCallback((repIds: string[], onlyResetWithoutCustomTimes: boolean = false) => {
const tomorrow = new Date()
tomorrow.setDate(tomorrow.getDate() + 1)
tomorrow.setHours(9, 0, 0, 0)
@@ -364,8 +334,11 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
setScheduledRepetitions(prev => {
const newScheduled = { ...prev }
- // Filter out locked repetitions - they should not be moved
- const unlockedRepIds = repIds.filter(repId => !lockedSchedules.has(repId))
+ // If onlyResetWithoutCustomTimes is true, filter out repetitions that have custom times set
+ let unlockedRepIds = repIds
+ if (onlyResetWithoutCustomTimes) {
+ unlockedRepIds = unlockedRepIds.filter(repId => !repetitionsWithTimes.has(repId))
+ }
// Calculate stagger index only for unlocked repetitions
let staggerIndex = 0
@@ -407,7 +380,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
return newScheduled
})
- }, [lockedSchedules, repetitionsByExperiment, experimentsByPhase, soakingByExperiment, airdryingByExperiment])
+ }, [repetitionsByExperiment, experimentsByPhase, soakingByExperiment, airdryingByExperiment, repetitionsWithTimes])
// Spawn a single repetition in calendar
const spawnSingleRepetition = (repId: string, updatedSelectedIds?: Set<string>) => {
@@ -477,10 +450,11 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
let newScheduled = { ...prev }
const clampToReasonableHours = (d: Date) => {
// Allow full 24 hours (midnight to midnight)
const min = new Date(d)
- min.setHours(5, 0, 0, 0)
+ min.setHours(0, 0, 0, 0)
const max = new Date(d)
- max.setHours(23, 0, 0, 0)
+ max.setHours(23, 59, 59, 999)
const t = d.getTime()
return new Date(Math.min(Math.max(t, min.getTime()), max.getTime()))
}
@@ -536,13 +510,10 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
const repetition = repetitionsByExperiment[scheduled.experimentId]?.find(r => r.id === scheduled.repetitionId)
if (experiment && repetition && scheduled.soakingStart) {
- const isLocked = lockedSchedules.has(scheduled.repetitionId)
- const lockIcon = isLocked ? '🔒' : '🔓'
// Soaking marker
events.push({
id: `${scheduled.repetitionId}-soaking`,
- title: `${lockIcon} 💧 Soaking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
+ title: `💧 Soaking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
start: scheduled.soakingStart,
end: new Date(scheduled.soakingStart.getTime() + 15 * 60000), // 15 minute duration for better visibility
resource: 'soaking'
@@ -552,7 +523,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
if (scheduled.airdryingStart) {
events.push({
id: `${scheduled.repetitionId}-airdrying`,
- title: `${lockIcon} 🌬️ Airdrying - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
+ title: `🌬️ Airdrying - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
start: scheduled.airdryingStart,
end: new Date(scheduled.airdryingStart.getTime() + 15 * 60000), // 15 minute duration for better visibility
resource: 'airdrying'
@@ -563,7 +534,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
if (scheduled.crackingStart) {
events.push({
id: `${scheduled.repetitionId}-cracking`,
- title: `${lockIcon} ⚡ Cracking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
+ title: `⚡ Cracking - Exp ${experiment.experiment_number} Rep ${repetition.repetition_number}`,
start: scheduled.crackingStart,
end: new Date(scheduled.crackingStart.getTime() + 15 * 60000), // 15 minute duration for better visibility
resource: 'cracking'
@@ -573,7 +544,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
})
return events
- }, [scheduledRepetitions, experimentsByPhase, repetitionsByExperiment, lockedSchedules])
+ }, [scheduledRepetitions, experimentsByPhase, repetitionsByExperiment])
// Memoize the calendar events
const calendarEvents = useMemo(() => {
@@ -609,15 +580,16 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
return moment(date).format('MMM D, h:mm A')
}
- const toggleScheduleLock = (repId: string) => {
- setLockedSchedules(prev => {
- const next = new Set(prev)
- if (next.has(repId)) {
- next.delete(repId)
- } else {
- next.add(repId)
- }
- return next
+ // Remove all conductor assignments from a repetition
+ const removeRepetitionAssignments = (repId: string) => {
+ const markerIdPrefix = repId
+ setConductorAssignments(prev => {
+ const newAssignments = { ...prev }
+ // Remove assignments for all three phases
+ delete newAssignments[`${markerIdPrefix}-soaking`]
+ delete newAssignments[`${markerIdPrefix}-airdrying`]
+ delete newAssignments[`${markerIdPrefix}-cracking`]
+ return newAssignments
})
}
@@ -625,24 +597,16 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
// Only make repetition markers draggable, not availability events
const resource = event.resource as string
if (resource === 'soaking' || resource === 'airdrying' || resource === 'cracking') {
- // Check if the repetition is locked
- const eventId = event.id as string
- const repId = eventId.split('-')[0]
- const isLocked = lockedSchedules.has(repId)
- return !isLocked
+ return true
}
return false
- }, [lockedSchedules])
+ }, [])
const eventPropGetter = useCallback((event: any) => {
const resource = event.resource as string
// Styling for repetition markers (foreground events)
if (resource === 'soaking' || resource === 'airdrying' || resource === 'cracking') {
- const eventId = event.id as string
- const repId = eventId.split('-')[0]
- const isLocked = lockedSchedules.has(repId)
const colors = {
soaking: '#3b82f6', // blue
airdrying: '#10b981', // green
@@ -652,8 +616,8 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
return {
style: {
- backgroundColor: isLocked ? '#9ca3af' : color, // gray if locked
- borderColor: isLocked ? color : color, // border takes original color when locked
+ backgroundColor: color,
+ borderColor: color,
color: 'white',
borderRadius: '8px',
border: '2px solid',
@@ -674,17 +638,17 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
overflow: 'hidden',
textOverflow: 'ellipsis',
whiteSpace: 'nowrap',
- cursor: isLocked ? 'not-allowed' : 'grab',
- boxShadow: isLocked ? '0 1px 2px rgba(0,0,0,0.1)' : '0 2px 4px rgba(0,0,0,0.2)',
+ cursor: 'grab',
+ boxShadow: '0 2px 4px rgba(0,0,0,0.2)',
transition: 'all 0.2s ease',
- opacity: isLocked ? 0.7 : 1
+ opacity: 1
}
}
}
// Default styling for other events
return {}
- }, [lockedSchedules])
+ }, [])
const scheduleRepetition = async (repId: string, experimentId: string) => {
setSchedulingRepetitions(prev => new Set(prev).add(repId))
@@ -756,6 +720,51 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
}
}
// Unschedule a repetition: clear its scheduling info and unassign all conductors.
const unscheduleRepetition = async (repId: string, experimentId: string) => {
setSchedulingRepetitions(prev => new Set(prev).add(repId))
try {
// Remove all conductor assignments for this repetition
removeRepetitionAssignments(repId)
// Clear scheduled_date on the repetition in local state
setRepetitionsByExperiment(prev => ({
...prev,
[experimentId]: prev[experimentId]?.map(r =>
r.id === repId ? { ...r, scheduled_date: null } : r
) || []
}))
// Clear scheduled times for this repetition so it disappears from the timeline
setScheduledRepetitions(prev => {
const next = { ...prev }
delete next[repId]
return next
})
// This repetition no longer has active times
setRepetitionsWithTimes(prev => {
const next = new Set(prev)
next.delete(repId)
return next
})
// Also clear scheduled_date in the database for this repetition
await repetitionManagement.updateRepetition(repId, {
scheduled_date: null
})
} catch (error: any) {
setError(error?.message || 'Failed to unschedule repetition')
} finally {
setSchedulingRepetitions(prev => {
const next = new Set(prev)
next.delete(repId)
return next
})
}
}
// Restore scroll position after scheduledRepetitions changes
useEffect(() => {
if (scrollPositionRef.current) {
@@ -806,11 +815,15 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
phase: 'soaking' | 'airdrying' | 'cracking'
startTime: Date
assignedConductors: string[]
- locked: boolean
}> = []
Object.values(scheduledRepetitions).forEach(scheduled => {
const repId = scheduled.repetitionId
// Only include markers for repetitions that are checked (selected)
if (!selectedRepetitionIds.has(repId)) {
return
}
const markerIdPrefix = repId
if (scheduled.soakingStart) {
@@ -820,8 +833,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
experimentId: scheduled.experimentId,
phase: 'soaking',
startTime: scheduled.soakingStart,
- assignedConductors: conductorAssignments[`${markerIdPrefix}-soaking`] || [],
- locked: lockedSchedules.has(repId)
+ assignedConductors: conductorAssignments[`${markerIdPrefix}-soaking`] || []
})
}
@@ -832,8 +844,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
experimentId: scheduled.experimentId,
phase: 'airdrying',
startTime: scheduled.airdryingStart,
- assignedConductors: conductorAssignments[`${markerIdPrefix}-airdrying`] || [],
- locked: lockedSchedules.has(repId)
+ assignedConductors: conductorAssignments[`${markerIdPrefix}-airdrying`] || []
})
}
@@ -844,8 +855,7 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
experimentId: scheduled.experimentId,
phase: 'cracking',
startTime: scheduled.crackingStart,
- assignedConductors: conductorAssignments[`${markerIdPrefix}-cracking`] || [],
- locked: lockedSchedules.has(repId)
+ assignedConductors: conductorAssignments[`${markerIdPrefix}-cracking`] || []
})
}
})
@@ -856,7 +866,66 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
conductorAvailabilities,
phaseMarkers
}
- }, [selectedConductorIds, conductors, conductorColorMap, colorPalette, availabilityEvents, scheduledRepetitions, conductorAssignments, lockedSchedules, calendarStartDate, calendarZoom])
+ }, [selectedConductorIds, conductors, conductorColorMap, colorPalette, availabilityEvents, scheduledRepetitions, conductorAssignments, calendarStartDate, calendarZoom, selectedRepetitionIds])
// Build repetition metadata mapping for timeline display
const repetitionMetadata = useMemo(() => {
const metadata: Record<string, { phaseName: string; experimentNumber: number; repetitionNumber: number; experimentId: string; isScheduledInDb: boolean }> = {}
Object.values(scheduledRepetitions).forEach(scheduled => {
const repId = scheduled.repetitionId
// Only include metadata for repetitions that are checked (selected)
if (!selectedRepetitionIds.has(repId)) {
return
}
const experiment = Object.values(experimentsByPhase).flat().find(e => e.id === scheduled.experimentId)
const repetition = Object.values(repetitionsByExperiment).flat().find(r => r.id === repId)
const phase = phases.find(p =>
Object.values(experimentsByPhase[p.id] || []).some(e => e.id === scheduled.experimentId)
)
if (experiment && repetition && phase) {
metadata[repId] = {
phaseName: phase.name,
experimentNumber: experiment.experiment_number,
repetitionNumber: repetition.repetition_number,
experimentId: scheduled.experimentId,
// Consider a repetition \"scheduled\" in DB if it has a non-null scheduled_date
isScheduledInDb: Boolean(repetition.scheduled_date)
}
}
})
return metadata
}, [scheduledRepetitions, experimentsByPhase, repetitionsByExperiment, phases, selectedRepetitionIds])
// Scroll to repetition in accordion
const handleScrollToRepetition = useCallback(async (repetitionId: string) => {
// First, expand the phase if it's collapsed
const repetition = Object.values(repetitionsByExperiment).flat().find(r => r.id === repetitionId)
if (repetition) {
const experiment = Object.values(experimentsByPhase).flat().find(e =>
(repetitionsByExperiment[e.id] || []).some(r => r.id === repetitionId)
)
if (experiment) {
const phase = phases.find(p =>
(experimentsByPhase[p.id] || []).some(e => e.id === experiment.id)
)
if (phase && !expandedPhaseIds.has(phase.id)) {
await togglePhaseExpand(phase.id)
// Wait a bit for the accordion to expand
await new Promise(resolve => setTimeout(resolve, 300))
}
}
}
// Then scroll to the element
const element = document.getElementById(`repetition-${repetitionId}`)
if (element) {
element.scrollIntoView({ behavior: 'smooth', block: 'center' })
}
}, [repetitionsByExperiment, experimentsByPhase, phases, expandedPhaseIds, togglePhaseExpand])
// Handlers for horizontal calendar
const handleHorizontalMarkerDrag = useCallback((markerId: string, newTime: Date) => {
@@ -878,21 +947,6 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
}))
}, [])
- const handleHorizontalMarkerLockToggle = useCallback((markerId: string) => {
- // Marker ID format: ${repId}-${phase} where repId is a UUID with hyphens
- // Split by '-' and take all but the last segment as repId
- const parts = markerId.split('-')
- const repId = parts.slice(0, -1).join('-')
- setLockedSchedules(prev => {
- const next = new Set(prev)
- if (next.has(repId)) {
- next.delete(repId)
- } else {
- next.add(repId)
- }
- return next
- })
- }, [])
return (
@@ -1027,7 +1081,9 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
phaseMarkers={horizontalCalendarData.phaseMarkers}
onMarkerDrag={handleHorizontalMarkerDrag}
onMarkerAssignConductors={handleHorizontalMarkerAssignConductors}
- onMarkerLockToggle={handleHorizontalMarkerLockToggle}
+ repetitionMetadata={repetitionMetadata}
+ onScrollToRepetition={handleScrollToRepetition}
+ onScheduleRepetition={scheduleRepetition}
timeStep={15}
minHour={6}
maxHour={22}
@@ -1196,11 +1252,21 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
const checked = selectedRepetitionIds.has(rep.id)
const hasTimes = repetitionsWithTimes.has(rep.id)
const scheduled = scheduledRepetitions[rep.id]
- const isLocked = lockedSchedules.has(rep.id)
const isScheduling = schedulingRepetitions.has(rep.id)
// Check if there are any conductor assignments
const markerIdPrefix = rep.id
const soakingConductors = conductorAssignments[`${markerIdPrefix}-soaking`] || []
const airdryingConductors = conductorAssignments[`${markerIdPrefix}-airdrying`] || []
const crackingConductors = conductorAssignments[`${markerIdPrefix}-cracking`] || []
const hasAssignments = soakingConductors.length > 0 || airdryingConductors.length > 0 || crackingConductors.length > 0
return (
<div key={rep.id} className="bg-white dark:bg-gray-900 border border-gray-200 dark:border-gray-700 rounded p-3">
<div
key={rep.id}
id={`repetition-${rep.id}`}
className="bg-white dark:bg-gray-900 border border-gray-200 dark:border-gray-700 rounded p-3"
>
{/* Checkbox row */}
<label className="flex items-center gap-2">
<input
@@ -1212,44 +1278,89 @@ export function ScheduleExperiment({ user, onBack }: { user: User; onBack: () =>
<span className="text-sm font-medium text-gray-700 dark:text-gray-300">Rep {rep.repetition_number}</span>
</label>
- {/* Time points (shown only if has been dropped/moved) */}
- {hasTimes && scheduled && (
+ {/* Time points (shown whenever the repetition has scheduled times) */}
+ {scheduled && (
<div className="mt-2 ml-6 text-xs space-y-1">
<div className="flex items-center gap-2">
<span>💧</span>
<span>Soaking: {formatTime(scheduled.soakingStart)}</span>
</div>
<div className="flex items-center gap-2">
<span>🌬️</span>
<span>Airdrying: {formatTime(scheduled.airdryingStart)}</span>
</div>
<div className="flex items-center gap-2">
<span>⚡</span>
<span>Cracking: {formatTime(scheduled.crackingStart)}</span>
</div>
{(() => {
const repId = rep.id
const markerIdPrefix = repId
// Get assigned conductors for each phase
const soakingConductors = conductorAssignments[`${markerIdPrefix}-soaking`] || []
const airdryingConductors = conductorAssignments[`${markerIdPrefix}-airdrying`] || []
const crackingConductors = conductorAssignments[`${markerIdPrefix}-cracking`] || []
// Helper to get conductor names
const getConductorNames = (conductorIds: string[]) => {
return conductorIds.map(id => {
const conductor = conductors.find(c => c.id === id)
if (!conductor) return null
return [conductor.first_name, conductor.last_name].filter(Boolean).join(' ') || conductor.email
}).filter(Boolean).join(', ')
}
return (
<>
<div className="flex items-center gap-2 flex-wrap">
<span>💧</span>
<span>Soaking: {formatTime(scheduled.soakingStart)}</span>
{soakingConductors.length > 0 && (
<span className="text-blue-600 dark:text-blue-400">
({getConductorNames(soakingConductors)})
</span>
)}
</div>
<div className="flex items-center gap-2 flex-wrap">
<span>🌬️</span>
<span>Airdrying: {formatTime(scheduled.airdryingStart)}</span>
{airdryingConductors.length > 0 && (
<span className="text-green-600 dark:text-green-400">
({getConductorNames(airdryingConductors)})
</span>
)}
</div>
<div className="flex items-center gap-2 flex-wrap">
<span>⚡</span>
<span>Cracking: {formatTime(scheduled.crackingStart)}</span>
{crackingConductors.length > 0 && (
<span className="text-orange-600 dark:text-orange-400">
({getConductorNames(crackingConductors)})
</span>
)}
</div>
</>
)
})()}
- {/* Lock checkbox and Schedule button */}
+ {/* Remove Assignments button and Schedule/Unschedule button */}
<div className="flex items-center gap-3 mt-3 pt-2 border-t border-gray-200 dark:border-gray-600">
<label className="flex items-center gap-2">
<input
type="checkbox"
className="h-3 w-3 text-blue-600 border-gray-300 rounded"
checked={isLocked}
onChange={() => {
toggleScheduleLock(rep.id)
}}
/>
<span className="text-xs text-gray-600 dark:text-gray-400">
{isLocked ? '🔒 Locked' : '🔓 Unlocked'}
</span>
</label>
<button
onClick={() => scheduleRepetition(rep.id, exp.id)}
disabled={isScheduling || !isLocked}
className="px-3 py-1 bg-green-600 hover:bg-green-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
>
{isScheduling ? 'Scheduling...' : 'Schedule'}
</button>
{hasAssignments && (
<button
onClick={() => removeRepetitionAssignments(rep.id)}
disabled={Boolean(rep.scheduled_date)}
className="px-3 py-1 bg-red-600 hover:bg-red-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
title={rep.scheduled_date ? "Unschedule the repetition first before removing assignments" : "Remove all conductor assignments from this repetition"}
>
Remove Assignments
</button>
)}
{rep.scheduled_date ? (
<button
onClick={() => unscheduleRepetition(rep.id, exp.id)}
disabled={isScheduling}
className="px-3 py-1 bg-yellow-600 hover:bg-yellow-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
>
{isScheduling ? 'Unscheduling...' : 'Unschedule'}
</button>
) : (
<button
onClick={() => scheduleRepetition(rep.id, exp.id)}
disabled={isScheduling}
className="px-3 py-1 bg-green-600 hover:bg-green-700 disabled:bg-gray-400 text-white rounded text-xs transition-colors"
>
{isScheduling ? 'Scheduling...' : 'Schedule'}
</button>
)}
</div>
</div>
)}


@@ -0,0 +1,57 @@
#!/bin/sh
# Build the project first
echo "Building scheduling-remote..."
npm run build
# Verify the initial build created remoteEntry.js
REMOTE_ENTRY_PATH="dist/assets/remoteEntry.js"
if [ ! -f "$REMOTE_ENTRY_PATH" ]; then
echo "ERROR: Initial build did not create remoteEntry.js!"
exit 1
fi
echo "Initial build complete. remoteEntry.js exists."
# Start build:watch in the background
echo "Starting build:watch in background..."
npm run build:watch &
BUILD_WATCH_PID=$!
# Wait a moment for build:watch to start and potentially rebuild
echo "Waiting for build:watch to stabilize..."
sleep 3
# Verify remoteEntry.js still exists (build:watch might have rebuilt it)
MAX_WAIT=30
WAIT_COUNT=0
while [ ! -f "$REMOTE_ENTRY_PATH" ] && [ $WAIT_COUNT -lt $MAX_WAIT ]; do
sleep 1
WAIT_COUNT=$((WAIT_COUNT + 1))
if [ $((WAIT_COUNT % 5)) -eq 0 ]; then
echo "Waiting for remoteEntry.js after build:watch... (${WAIT_COUNT}s)"
fi
done
if [ ! -f "$REMOTE_ENTRY_PATH" ]; then
echo "ERROR: remoteEntry.js was not available after ${MAX_WAIT} seconds!"
kill $BUILD_WATCH_PID 2>/dev/null || true
exit 1
fi
# Wait a bit more to ensure build:watch has finished any initial rebuild
echo "Ensuring build:watch has completed initial build..."
sleep 2
# Check file size to ensure it's not empty or being written
FILE_SIZE=$(stat -f%z "$REMOTE_ENTRY_PATH" 2>/dev/null || stat -c%s "$REMOTE_ENTRY_PATH" 2>/dev/null || echo "0")
if [ "$FILE_SIZE" -lt 100 ]; then
echo "WARNING: remoteEntry.js seems too small (${FILE_SIZE} bytes), waiting a bit more..."
sleep 2
fi
echo "remoteEntry.js is ready (${FILE_SIZE} bytes). Starting http-server..."
# Start http-server; exec replaces this shell so signals reach the server directly
exec npx http-server dist -p 3003 --cors -c-1


@@ -10,7 +10,7 @@ POSTGRES_PASSWORD="${POSTGRES_PASSWORD:-your-super-secret-and-long-postgres-pass
# Generate JWT tokens (anon and service_role)
# These are simplified versions - in production, use Supabase's key generation
ANON_KEY="${ANON_KEY:-eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0}"
ANON_KEY="${ANON_KEY:-[REDACTED]}"
SERVICE_KEY="${SERVICE_KEY:-eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImV4cCI6MTk4MzgxMjk5Nn0.EGIM96RAZx35lJzdJsyH-qQwv8Hdp7fsn3W0YpN81IU}"
echo "Supabase Configuration for Docker Compose"

secrets.nix (new file, 14 lines)

@@ -0,0 +1,14 @@
# Ragenix Configuration
# This file defines which secrets to manage and their permissions
let
# Import public keys from secrets/secrets.nix (a plain ./secrets.nix here would import this file itself)
keys = import ./secrets/secrets.nix;
in
{
# Main environment file
"env.age".publicKeys = keys.publicKeys;
# Azure OAuth configuration
"env.azure.age".publicKeys = keys.publicKeys;
}

secrets/.gitignore (new file, 11 lines)

@@ -0,0 +1,11 @@
# Ignore unencrypted secrets
*.env
!*.env.example
.env.*
!.env.*.example
# Ignore age private keys (if accidentally placed here)
*.txt
# Keep encrypted files
!*.age

secrets/README.md (new file, 75 lines)

@@ -0,0 +1,75 @@
# USDA Vision Secrets Management
This directory contains encrypted secrets managed by [ragenix](https://github.com/yaxitech/ragenix).
## Setup
1. **Generate an age key** (if you don't have one):
```bash
# Generate a new age key
age-keygen -o ~/.config/age/keys.txt
# Or convert your SSH key
ssh-to-age < ~/.ssh/id_ed25519.pub
```
2. **Add your public key to `secrets.nix`**:
```nix
{
publicKeys = [
"age1..." # Your age public key
"ssh-ed25519 ..." # Or your SSH public key
];
}
```
3. **Create and encrypt environment files**:
```bash
# Create the encrypted .env file
ragenix -e secrets/env.age
# Create the encrypted .env.azure file
ragenix -e secrets/env.azure.age
```
## Usage in Development
In the development shell:
```bash
# Edit encrypted secrets
ragenix -e secrets/env.age
# Re-key secrets after adding a new public key
ragenix -r
```
## Usage in NixOS
The flake's NixOS module automatically handles decryption:
```nix
{
services.usda-vision = {
enable = true;
secretsFile = config.age.secrets.usda-vision-env.path;
};
age.secrets.usda-vision-env = {
file = ./usda-vision/secrets/env.age;
mode = "0644";
};
}
```
## Files
- `secrets.nix` - Public keys configuration
- `env.age` - Encrypted main .env file
- `env.azure.age` - Encrypted Azure OAuth configuration
- `README.md` - This file
## Security Notes
- Never commit unencrypted `.env` files
- Keep your age private key secure (`~/.config/age/keys.txt`)
- The `.age` encrypted files are safe to commit to git

secrets/secrets.nix (new file, 14 lines)

@@ -0,0 +1,14 @@
# Public keys for secret encryption
# Add your age or SSH public keys here
{
publicKeys = [
# Example age public key:
# "age1qyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqs3ekg8p"
# Example SSH public key (ed25519):
# "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAI... user@host"
# Add your keys below:
# TODO: Add your age or SSH public keys
];
}

setup-dev.sh (new executable file, 77 lines)

@@ -0,0 +1,77 @@
#!/usr/bin/env bash
# Quick setup script for USDA Vision development
set -e
echo "======================================"
echo "USDA Vision - Quick Setup"
echo "======================================"
echo ""
# Check if we're in the right directory
if [ ! -f "flake.nix" ]; then
echo "❌ Error: Must run from usda-vision directory"
echo " cd to the directory containing flake.nix"
exit 1
fi
# Check for age key
if [ ! -f "$HOME/.config/age/keys.txt" ]; then
echo "📝 No age key found at ~/.config/age/keys.txt"
echo ""
read -p "Would you like to generate one? (y/n) " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
mkdir -p "$HOME/.config/age"
age-keygen -o "$HOME/.config/age/keys.txt"
echo "✅ Age key generated!"
echo ""
else
echo "❌ Cannot proceed without an age key"
exit 1
fi
fi
# Get public key
AGE_PUBLIC_KEY=$(grep "public key:" "$HOME/.config/age/keys.txt" | cut -d: -f2 | xargs)
echo "Your age public key is:"
echo " $AGE_PUBLIC_KEY"
echo ""
# Check if key is already in secrets.nix
if grep -q "$AGE_PUBLIC_KEY" secrets/secrets.nix 2>/dev/null; then
echo "✅ Your key is already in secrets/secrets.nix"
else
echo "⚠️ Your key is NOT in secrets/secrets.nix"
echo ""
read -p "Would you like to add it now? (y/n) " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
# Backup original
cp secrets/secrets.nix secrets/secrets.nix.backup
# Add the key
sed -i "/publicKeys = \[/a\ \"$AGE_PUBLIC_KEY\"" secrets/secrets.nix
echo "✅ Key added to secrets/secrets.nix"
echo ""
fi
fi
echo "======================================"
echo "Setup complete! Next steps:"
echo "======================================"
echo ""
echo "1. Enter development environment:"
echo " $ nix develop"
echo ""
echo "2. Create/edit encrypted secrets:"
echo " $ ragenix -e secrets/env.age"
echo " $ ragenix -e secrets/env.azure.age"
echo ""
echo "3. Start development:"
echo " $ docker-compose up -d"
echo ""
echo "For more information, see FLAKE_SETUP.md"
echo ""

supabase/.gitignore (new file, 8 lines)

@@ -0,0 +1,8 @@
# Supabase
.branches
.temp
# dotenvx
.env.keys
.env.local
.env.*.local


@@ -64,9 +64,9 @@ supabase gen types typescript --local > management-dashboard-web-app/src/types/s
## Seed Data
- Seed files are run automatically after migrations when using docker-compose. They populate the database with initial data:
- - `seed_01_users.sql`: Creates admin user and initial user profiles
- - `seed_02_phase2_experiments.sql`: Creates initial experiment data
+ Seed files are run automatically after migrations when using `supabase db reset` (see `config.toml` → `[db.seed]` → `sql_paths`). Currently only the user seed is enabled:
+ - `seed_01_users.sql`: Creates admin user and initial user profiles (enabled)
+ - `seed_02_phase2_experiments.sql`: Initial experiment data (temporarily disabled; add back to `sql_paths` in `config.toml` to re-enable)
## Configuration


@@ -0,0 +1,14 @@
-- Test migration to create experiment_phases table
CREATE TABLE IF NOT EXISTS public.experiment_phases (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
name TEXT NOT NULL UNIQUE,
description TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
created_by UUID NOT NULL
);
-- Insert test data
INSERT INTO public.experiment_phases (name, description, created_by)
VALUES ('Phase 2 of JC Experiments', 'Second phase of JC Cracker experiments', '00000000-0000-0000-0000-000000000000')
ON CONFLICT (name) DO NOTHING;


@@ -57,7 +57,9 @@ schema_paths = []
enabled = true
# Specifies an ordered list of seed files to load during db reset.
# Supports glob patterns relative to supabase directory: "./seeds/*.sql"
sql_paths = ["./seed_01_users.sql", "./seed_02_phase2_experiments.sql"]
# Temporarily only user seed; other seeds suppressed.
sql_paths = ["./seed_01_users.sql"]
# sql_paths = ["./seed_01_users.sql", "./seed_02_phase2_experiments.sql"]
# , "./seed_04_phase2_jc_experiments.sql", "./seed_05_meyer_experiments.sql"]
[db.network_restrictions]
@@ -278,6 +280,18 @@ url = ""
# If enabled, the nonce check will be skipped. Required for local sign in with Google auth.
skip_nonce_check = false
[auth.external.azure]
enabled = "env(VITE_ENABLE_MICROSOFT_LOGIN)"
client_id = "env(AZURE_CLIENT_ID)"
# DO NOT commit your OAuth provider secret to git. Use environment variable substitution instead:
secret = "env(AZURE_CLIENT_SECRET)"
# Overrides the default auth redirectUrl.
redirect_uri = "env(AZURE_REDIRECT_URI)"
# Azure tenant ID or 'common' for multi-tenant. Use 'common', 'organizations', 'consumers', or your specific tenant ID.
url = "env(AZURE_TENANT_URL)"
# If enabled, the nonce check will be skipped.
skip_nonce_check = false
# Allow Solana wallet holders to sign in to your project via the Sign in with Solana (SIWS, EIP-4361) standard.
# You can configure "web3" rate limit in the [auth.rate_limit] section and set up [auth.captcha] if self-hosting.
[auth.web3.solana]


@@ -0,0 +1,46 @@
-- OAuth User Synchronization
-- This migration adds functionality to automatically create user profiles when users sign up via OAuth
-- =============================================
-- 1. CREATE FUNCTION FOR OAUTH USER AUTO-PROFILE CREATION
-- =============================================
CREATE OR REPLACE FUNCTION public.handle_new_oauth_user()
RETURNS TRIGGER AS $$
BEGIN
-- Check if user profile already exists
IF NOT EXISTS (
SELECT 1 FROM public.user_profiles WHERE id = NEW.id
) THEN
-- Create user profile with default active status
INSERT INTO public.user_profiles (id, email, status)
VALUES (
NEW.id,
NEW.email,
'active'
)
ON CONFLICT (id) DO NOTHING;
END IF;
RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
-- =============================================
-- 2. CREATE TRIGGER FOR NEW AUTH USERS
-- =============================================
-- Drop the trigger if it exists to avoid conflicts
DROP TRIGGER IF EXISTS on_auth_user_created ON auth.users;
-- Create trigger that fires after a new user is created in auth.users
CREATE TRIGGER on_auth_user_created
AFTER INSERT ON auth.users
FOR EACH ROW EXECUTE FUNCTION public.handle_new_oauth_user();
-- =============================================
-- 3. COMMENT FOR DOCUMENTATION
-- =============================================
COMMENT ON FUNCTION public.handle_new_oauth_user() IS
'Automatically creates a user profile in public.user_profiles when a new user is created via OAuth in auth.users. This ensures OAuth users are immediately accessible in the application without manual provisioning.';
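The migration's moving parts are an `AFTER INSERT` trigger plus an `ON CONFLICT (id) DO NOTHING` guard so a pre-existing profile is never clobbered. A hedged sketch of the same mechanism in sqlite (not Postgres/plpgsql; `INSERT OR IGNORE` stands in for the conflict guard, and table shapes are simplified):

```python
import sqlite3

# sqlite analogue of handle_new_oauth_user: inserting into users
# automatically provisions a matching user_profiles row.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE users (id TEXT PRIMARY KEY, email TEXT);
    CREATE TABLE user_profiles (
        id TEXT PRIMARY KEY,
        email TEXT,
        status TEXT NOT NULL DEFAULT 'active'
    );
    CREATE TRIGGER on_user_created AFTER INSERT ON users
    BEGIN
        INSERT OR IGNORE INTO user_profiles (id, email, status)
        VALUES (NEW.id, NEW.email, 'active');
    END;
    """
)
conn.execute("INSERT INTO users VALUES ('u1', 'a@example.com')")
row = conn.execute(
    "SELECT status FROM user_profiles WHERE id = 'u1'"
).fetchone()
print(row)  # ('active',)
```

In the real migration the function additionally runs as `SECURITY DEFINER`, since ordinary OAuth users have no direct write access to `public.user_profiles`.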


@@ -70,6 +70,10 @@ CREATE TABLE IF NOT EXISTS public.shelling (
scheduled_start_time TIMESTAMP WITH TIME ZONE NOT NULL,
actual_start_time TIMESTAMP WITH TIME ZONE,
actual_end_time TIMESTAMP WITH TIME ZONE,
-- The space (in inches) between the sheller's rings
ring_gap_inches NUMERIC(6,2) CHECK (ring_gap_inches > 0),
-- The revolutions per minute for the sheller drum
drum_rpm INTEGER CHECK (drum_rpm > 0),
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
created_by UUID NOT NULL REFERENCES public.user_profiles(id),


@@ -56,6 +56,80 @@ CREATE INDEX IF NOT EXISTS idx_phase_executions_machine_type_id
CREATE INDEX IF NOT EXISTS idx_phase_executions_created_by
ON public.experiment_phase_executions(created_by);
-- =============================================
-- 2.5. CREATE CONDUCTOR ASSIGNMENTS TABLE
-- =============================================
-- Table to store conductor assignments to phase executions
-- This allows multiple conductors to be assigned to each phase execution
CREATE TABLE IF NOT EXISTS public.experiment_phase_assignments (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
phase_execution_id UUID NOT NULL REFERENCES public.experiment_phase_executions(id) ON DELETE CASCADE,
conductor_id UUID NOT NULL REFERENCES public.user_profiles(id) ON DELETE CASCADE,
-- Scheduled times for this assignment (should match phase_execution times, but stored for clarity)
scheduled_start_time TIMESTAMP WITH TIME ZONE NOT NULL,
scheduled_end_time TIMESTAMP WITH TIME ZONE,
-- Status tracking
status TEXT NOT NULL DEFAULT 'scheduled'
CHECK (status IN ('scheduled', 'in_progress', 'completed', 'cancelled')),
-- Optional notes about the assignment
notes TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
created_by UUID NOT NULL REFERENCES public.user_profiles(id),
-- Ensure scheduled_end_time is after scheduled_start_time
CONSTRAINT valid_scheduled_time_range CHECK (scheduled_end_time IS NULL OR scheduled_end_time > scheduled_start_time),
-- Ensure unique assignment per conductor per phase execution
CONSTRAINT unique_conductor_phase_execution UNIQUE (phase_execution_id, conductor_id)
);
-- Indexes for conductor assignments
CREATE INDEX IF NOT EXISTS idx_phase_assignments_phase_execution_id
ON public.experiment_phase_assignments(phase_execution_id);
CREATE INDEX IF NOT EXISTS idx_phase_assignments_conductor_id
ON public.experiment_phase_assignments(conductor_id);
CREATE INDEX IF NOT EXISTS idx_phase_assignments_status
ON public.experiment_phase_assignments(status);
CREATE INDEX IF NOT EXISTS idx_phase_assignments_scheduled_start_time
ON public.experiment_phase_assignments(scheduled_start_time);
CREATE INDEX IF NOT EXISTS idx_phase_assignments_created_by
ON public.experiment_phase_assignments(created_by);
-- Trigger for updated_at on conductor assignments
CREATE TRIGGER set_updated_at_phase_assignments
BEFORE UPDATE ON public.experiment_phase_assignments
FOR EACH ROW
EXECUTE FUNCTION public.handle_updated_at();
-- Grant permissions
GRANT ALL ON public.experiment_phase_assignments TO authenticated;
-- Enable Row Level Security
ALTER TABLE public.experiment_phase_assignments ENABLE ROW LEVEL SECURITY;
-- RLS Policies for conductor assignments
CREATE POLICY "Phase assignments are viewable by authenticated users"
ON public.experiment_phase_assignments
FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Phase assignments are insertable by authenticated users"
ON public.experiment_phase_assignments
FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Phase assignments are updatable by authenticated users"
ON public.experiment_phase_assignments
FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Phase assignments are deletable by authenticated users"
ON public.experiment_phase_assignments
FOR DELETE USING (auth.role() = 'authenticated');
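The table enforces two integrity rules at the schema level: `valid_scheduled_time_range` (an end time, when present, must follow the start) and `unique_conductor_phase_execution` (one assignment per conductor per phase execution). A small sqlite sketch (not Postgres; ISO-8601 strings stand in for `TIMESTAMP WITH TIME ZONE`, and column set is trimmed) showing both rules rejecting bad rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE assignments (
        phase_execution_id TEXT NOT NULL,
        conductor_id TEXT NOT NULL,
        scheduled_start_time TEXT NOT NULL,
        scheduled_end_time TEXT,
        CHECK (scheduled_end_time IS NULL
               OR scheduled_end_time > scheduled_start_time),
        UNIQUE (phase_execution_id, conductor_id)
    )
    """
)
conn.execute(
    "INSERT INTO assignments VALUES ('pe1', 'c1', '2026-01-01T09:00', NULL)"
)

errors = []
try:  # same conductor assigned twice to one phase execution: rejected
    conn.execute(
        "INSERT INTO assignments VALUES ('pe1', 'c1', '2026-01-01T10:00', NULL)"
    )
except sqlite3.IntegrityError:
    errors.append("unique")
try:  # end time before start time: violates the CHECK
    conn.execute(
        "INSERT INTO assignments VALUES "
        "('pe2', 'c1', '2026-01-01T10:00', '2026-01-01T09:00')"
    )
except sqlite3.IntegrityError:
    errors.append("check")
print(errors)  # ['unique', 'check']
```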
-- =============================================
-- 3. FUNCTION: Calculate Sequential Phase Start Times
-- =============================================


@@ -0,0 +1,9 @@
-- Add experiment-level shelling parameters (defaults for repetitions)
-- These match the shelling table attributes: ring_gap_inches, drum_rpm
ALTER TABLE public.experiments
ADD COLUMN IF NOT EXISTS ring_gap_inches NUMERIC(6,2) CHECK (ring_gap_inches IS NULL OR ring_gap_inches > 0),
ADD COLUMN IF NOT EXISTS drum_rpm INTEGER CHECK (drum_rpm IS NULL OR drum_rpm > 0);
COMMENT ON COLUMN public.experiments.ring_gap_inches IS 'Default space (inches) between sheller rings for this experiment';
COMMENT ON COLUMN public.experiments.drum_rpm IS 'Default sheller drum revolutions per minute for this experiment';


@@ -0,0 +1,399 @@
-- Rename table experiment_phases to experiment_books
-- This migration renames the table and updates all dependent objects (views, functions, triggers, indexes, RLS).
-- =============================================
-- 1. RENAME TABLE
-- =============================================
ALTER TABLE public.experiment_phases RENAME TO experiment_books;
-- =============================================
-- 2. RENAME TRIGGER
-- =============================================
DROP TRIGGER IF EXISTS set_updated_at_experiment_phases ON public.experiment_books;
CREATE TRIGGER set_updated_at_experiment_books
BEFORE UPDATE ON public.experiment_books
FOR EACH ROW
EXECUTE FUNCTION public.handle_updated_at();
-- =============================================
-- 3. RENAME CONSTRAINT
-- =============================================
ALTER TABLE public.experiment_books
RENAME CONSTRAINT ck_experiment_phases_machine_required_when_cracking
TO ck_experiment_books_machine_required_when_cracking;
-- =============================================
-- 4. RENAME INDEXES
-- =============================================
ALTER INDEX IF EXISTS public.idx_experiment_phases_name RENAME TO idx_experiment_books_name;
ALTER INDEX IF EXISTS public.idx_experiment_phases_cracking_machine_type_id RENAME TO idx_experiment_books_cracking_machine_type_id;
-- =============================================
-- 5. RLS POLICIES (drop old, create new with updated names)
-- =============================================
DROP POLICY IF EXISTS "Experiment phases are viewable by authenticated users" ON public.experiment_books;
DROP POLICY IF EXISTS "Experiment phases are insertable by authenticated users" ON public.experiment_books;
DROP POLICY IF EXISTS "Experiment phases are updatable by authenticated users" ON public.experiment_books;
DROP POLICY IF EXISTS "Experiment phases are deletable by authenticated users" ON public.experiment_books;
CREATE POLICY "Experiment books are viewable by authenticated users" ON public.experiment_books
FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment books are insertable by authenticated users" ON public.experiment_books
FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment books are updatable by authenticated users" ON public.experiment_books
FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment books are deletable by authenticated users" ON public.experiment_books
FOR DELETE USING (auth.role() = 'authenticated');
-- =============================================
-- 6. UPDATE FUNCTION: create_phase_executions_for_repetition (previously referenced experiment_phases)
-- =============================================
CREATE OR REPLACE FUNCTION create_phase_executions_for_repetition()
RETURNS TRIGGER AS $$
DECLARE
exp_phase_config RECORD;
phase_type_list TEXT[] := ARRAY[]::TEXT[];
phase_name TEXT;
BEGIN
SELECT
ep.has_soaking,
ep.has_airdrying,
ep.has_cracking,
ep.has_shelling,
ep.cracking_machine_type_id
INTO exp_phase_config
FROM public.experiments e
JOIN public.experiment_books ep ON e.phase_id = ep.id
WHERE e.id = NEW.experiment_id;
IF exp_phase_config.has_soaking THEN
phase_type_list := array_append(phase_type_list, 'soaking');
END IF;
IF exp_phase_config.has_airdrying THEN
phase_type_list := array_append(phase_type_list, 'airdrying');
END IF;
IF exp_phase_config.has_cracking THEN
phase_type_list := array_append(phase_type_list, 'cracking');
END IF;
IF exp_phase_config.has_shelling THEN
phase_type_list := array_append(phase_type_list, 'shelling');
END IF;
FOREACH phase_name IN ARRAY phase_type_list
LOOP
INSERT INTO public.experiment_phase_executions (
repetition_id,
phase_type,
scheduled_start_time,
status,
created_by,
soaking_duration_minutes,
duration_minutes,
machine_type_id
)
VALUES (
NEW.id,
phase_name,
NOW(),
'pending',
NEW.created_by,
NULL,
NULL,
CASE WHEN phase_name = 'cracking'
THEN exp_phase_config.cracking_machine_type_id
ELSE NULL END
);
END LOOP;
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
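The core of the function is flag-to-phase-list translation: each enabled boolean on the experiment book appends its phase, and the fixed append order (soaking, airdrying, cracking, shelling) fixes the process order. A plain-Python sketch of that logic, outside any database:

```python
# Mirrors the array_append chain in create_phase_executions_for_repetition:
# one entry per enabled flag, in the fixed processing order.
def phase_list(has_soaking, has_airdrying, has_cracking, has_shelling):
    flags = [
        ("soaking", has_soaking),
        ("airdrying", has_airdrying),
        ("cracking", has_cracking),
        ("shelling", has_shelling),
    ]
    return [name for name, enabled in flags if enabled]

print(phase_list(True, True, True, True))     # full-process book
print(phase_list(False, False, True, False))  # cracking-only book
```

One execution row is then inserted per list entry, with `machine_type_id` populated only for the `'cracking'` phase.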
-- =============================================
-- 7. UPDATE FUNCTION: create_sample_experiment_phases (INSERT into experiment_books)
-- =============================================
CREATE OR REPLACE FUNCTION public.create_sample_experiment_phases()
RETURNS VOID AS $$
DECLARE
jc_cracker_id UUID;
meyer_cracker_id UUID;
BEGIN
SELECT id INTO jc_cracker_id FROM public.machine_types WHERE name = 'JC Cracker';
SELECT id INTO meyer_cracker_id FROM public.machine_types WHERE name = 'Meyer Cracker';
INSERT INTO public.experiment_books (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by) VALUES
('Full Process - JC Cracker', 'Complete pecan processing with JC Cracker', true, true, true, true, jc_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1)),
('Full Process - Meyer Cracker', 'Complete pecan processing with Meyer Cracker', true, true, true, true, meyer_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1)),
('Cracking Only - JC Cracker', 'JC Cracker cracking process only', false, false, true, false, jc_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1)),
('Cracking Only - Meyer Cracker', 'Meyer Cracker cracking process only', false, false, true, false, meyer_cracker_id, (SELECT id FROM public.user_profiles LIMIT 1))
ON CONFLICT (name) DO NOTHING;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
-- =============================================
-- 8. UPDATE VIEWS (from 00014 - experiments_with_phases, repetitions_with_phases, experiments_with_all_reps_and_phases, get_experiment_with_reps_and_phases)
-- =============================================
CREATE OR REPLACE VIEW public.experiments_with_phases AS
SELECT
e.id,
e.experiment_number,
e.reps_required,
e.weight_per_repetition_lbs,
e.results_status,
e.completion_status,
e.phase_id,
e.created_at,
e.updated_at,
e.created_by,
ep.name as phase_name,
ep.description as phase_description,
ep.has_soaking,
ep.has_airdrying,
ep.has_cracking,
ep.has_shelling,
er.id as first_repetition_id,
er.repetition_number as first_repetition_number,
soaking_e.id as soaking_id,
soaking_e.scheduled_start_time as soaking_scheduled_start,
soaking_e.actual_start_time as soaking_actual_start,
soaking_e.soaking_duration_minutes,
soaking_e.scheduled_end_time as soaking_scheduled_end,
soaking_e.actual_end_time as soaking_actual_end,
airdrying_e.id as airdrying_id,
airdrying_e.scheduled_start_time as airdrying_scheduled_start,
airdrying_e.actual_start_time as airdrying_actual_start,
airdrying_e.duration_minutes as airdrying_duration,
airdrying_e.scheduled_end_time as airdrying_scheduled_end,
airdrying_e.actual_end_time as airdrying_actual_end,
cracking_e.id as cracking_id,
cracking_e.scheduled_start_time as cracking_scheduled_start,
cracking_e.actual_start_time as cracking_actual_start,
cracking_e.actual_end_time as cracking_actual_end,
mt.name as machine_type_name,
shelling_e.id as shelling_id,
shelling_e.scheduled_start_time as shelling_scheduled_start,
shelling_e.actual_start_time as shelling_actual_start,
shelling_e.actual_end_time as shelling_actual_end
FROM public.experiments e
LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
LEFT JOIN LATERAL (
SELECT id, repetition_number
FROM public.experiment_repetitions
WHERE experiment_id = e.id
ORDER BY repetition_number
LIMIT 1
) er ON true
LEFT JOIN public.experiment_phase_executions soaking_e
ON soaking_e.repetition_id = er.id AND soaking_e.phase_type = 'soaking'
LEFT JOIN public.experiment_phase_executions airdrying_e
ON airdrying_e.repetition_id = er.id AND airdrying_e.phase_type = 'airdrying'
LEFT JOIN public.experiment_phase_executions cracking_e
ON cracking_e.repetition_id = er.id AND cracking_e.phase_type = 'cracking'
LEFT JOIN public.experiment_phase_executions shelling_e
ON shelling_e.repetition_id = er.id AND shelling_e.phase_type = 'shelling'
LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id;
CREATE OR REPLACE VIEW public.repetitions_with_phases AS
SELECT
er.id,
er.experiment_id,
er.repetition_number,
er.status,
er.created_at,
er.updated_at,
er.created_by,
e.experiment_number,
e.phase_id,
e.weight_per_repetition_lbs,
ep.name as phase_name,
ep.has_soaking,
ep.has_airdrying,
ep.has_cracking,
ep.has_shelling,
soaking_e.scheduled_start_time as soaking_scheduled_start,
soaking_e.actual_start_time as soaking_actual_start,
soaking_e.soaking_duration_minutes,
soaking_e.scheduled_end_time as soaking_scheduled_end,
soaking_e.actual_end_time as soaking_actual_end,
airdrying_e.scheduled_start_time as airdrying_scheduled_start,
airdrying_e.actual_start_time as airdrying_actual_start,
airdrying_e.duration_minutes as airdrying_duration,
airdrying_e.scheduled_end_time as airdrying_scheduled_end,
airdrying_e.actual_end_time as airdrying_actual_end,
cracking_e.scheduled_start_time as cracking_scheduled_start,
cracking_e.actual_start_time as cracking_actual_start,
cracking_e.actual_end_time as cracking_actual_end,
mt.name as machine_type_name,
shelling_e.scheduled_start_time as shelling_scheduled_start,
shelling_e.actual_start_time as shelling_actual_start,
shelling_e.actual_end_time as shelling_actual_end
FROM public.experiment_repetitions er
JOIN public.experiments e ON er.experiment_id = e.id
LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
LEFT JOIN public.experiment_phase_executions soaking_e
ON er.id = soaking_e.repetition_id AND soaking_e.phase_type = 'soaking'
LEFT JOIN public.experiment_phase_executions airdrying_e
ON er.id = airdrying_e.repetition_id AND airdrying_e.phase_type = 'airdrying'
LEFT JOIN public.experiment_phase_executions cracking_e
ON er.id = cracking_e.repetition_id AND cracking_e.phase_type = 'cracking'
LEFT JOIN public.experiment_phase_executions shelling_e
ON er.id = shelling_e.repetition_id AND shelling_e.phase_type = 'shelling'
LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id;
-- experiments_with_all_reps_and_phases
CREATE OR REPLACE VIEW public.experiments_with_all_reps_and_phases AS
SELECT
e.id as experiment_id,
e.experiment_number,
e.reps_required,
e.weight_per_repetition_lbs,
e.results_status,
e.completion_status,
e.phase_id,
e.created_at as experiment_created_at,
e.updated_at as experiment_updated_at,
e.created_by as experiment_created_by,
ep.name as phase_name,
ep.description as phase_description,
ep.has_soaking,
ep.has_airdrying,
ep.has_cracking,
ep.has_shelling,
ep.cracking_machine_type_id as phase_cracking_machine_type_id,
er.id as repetition_id,
er.repetition_number,
er.status as repetition_status,
er.scheduled_date,
er.created_at as repetition_created_at,
er.updated_at as repetition_updated_at,
er.created_by as repetition_created_by,
soaking_e.id as soaking_execution_id,
soaking_e.scheduled_start_time as soaking_scheduled_start,
soaking_e.actual_start_time as soaking_actual_start,
soaking_e.soaking_duration_minutes,
soaking_e.scheduled_end_time as soaking_scheduled_end,
soaking_e.actual_end_time as soaking_actual_end,
soaking_e.status as soaking_status,
airdrying_e.id as airdrying_execution_id,
airdrying_e.scheduled_start_time as airdrying_scheduled_start,
airdrying_e.actual_start_time as airdrying_actual_start,
airdrying_e.duration_minutes as airdrying_duration_minutes,
airdrying_e.scheduled_end_time as airdrying_scheduled_end,
airdrying_e.actual_end_time as airdrying_actual_end,
airdrying_e.status as airdrying_status,
cracking_e.id as cracking_execution_id,
cracking_e.scheduled_start_time as cracking_scheduled_start,
cracking_e.actual_start_time as cracking_actual_start,
cracking_e.scheduled_end_time as cracking_scheduled_end,
cracking_e.actual_end_time as cracking_actual_end,
cracking_e.machine_type_id as cracking_machine_type_id,
cracking_e.status as cracking_status,
mt.name as machine_type_name,
shelling_e.id as shelling_execution_id,
shelling_e.scheduled_start_time as shelling_scheduled_start,
shelling_e.actual_start_time as shelling_actual_start,
shelling_e.scheduled_end_time as shelling_scheduled_end,
shelling_e.actual_end_time as shelling_actual_end,
shelling_e.status as shelling_status
FROM public.experiments e
LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
LEFT JOIN public.experiment_repetitions er ON er.experiment_id = e.id
LEFT JOIN public.experiment_phase_executions soaking_e
ON soaking_e.repetition_id = er.id AND soaking_e.phase_type = 'soaking'
LEFT JOIN public.experiment_phase_executions airdrying_e
ON airdrying_e.repetition_id = er.id AND airdrying_e.phase_type = 'airdrying'
LEFT JOIN public.experiment_phase_executions cracking_e
ON cracking_e.repetition_id = er.id AND cracking_e.phase_type = 'cracking'
LEFT JOIN public.experiment_phase_executions shelling_e
ON shelling_e.repetition_id = er.id AND shelling_e.phase_type = 'shelling'
LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id
ORDER BY e.experiment_number, er.repetition_number;
-- get_experiment_with_reps_and_phases function
CREATE OR REPLACE FUNCTION public.get_experiment_with_reps_and_phases(p_experiment_id UUID)
RETURNS TABLE (
experiment_id UUID,
experiment_number INTEGER,
phase_name TEXT,
repetitions JSONB
) AS $$
BEGIN
RETURN QUERY
SELECT
e.id,
e.experiment_number,
ep.name,
COALESCE(
jsonb_agg(
jsonb_build_object(
'repetition_id', er.id,
'repetition_number', er.repetition_number,
'status', er.status,
'scheduled_date', er.scheduled_date,
'soaking', jsonb_build_object(
'scheduled_start', soaking_e.scheduled_start_time,
'actual_start', soaking_e.actual_start_time,
'duration_minutes', soaking_e.soaking_duration_minutes,
'scheduled_end', soaking_e.scheduled_end_time,
'actual_end', soaking_e.actual_end_time,
'status', soaking_e.status
),
'airdrying', jsonb_build_object(
'scheduled_start', airdrying_e.scheduled_start_time,
'actual_start', airdrying_e.actual_start_time,
'duration_minutes', airdrying_e.duration_minutes,
'scheduled_end', airdrying_e.scheduled_end_time,
'actual_end', airdrying_e.actual_end_time,
'status', airdrying_e.status
),
'cracking', jsonb_build_object(
'scheduled_start', cracking_e.scheduled_start_time,
'actual_start', cracking_e.actual_start_time,
'scheduled_end', cracking_e.scheduled_end_time,
'actual_end', cracking_e.actual_end_time,
'machine_type_id', cracking_e.machine_type_id,
'machine_type_name', mt.name,
'status', cracking_e.status
),
'shelling', jsonb_build_object(
'scheduled_start', shelling_e.scheduled_start_time,
'actual_start', shelling_e.actual_start_time,
'scheduled_end', shelling_e.scheduled_end_time,
'actual_end', shelling_e.actual_end_time,
'status', shelling_e.status
)
)
ORDER BY er.repetition_number
),
'[]'::jsonb
) as repetitions
FROM public.experiments e
LEFT JOIN public.experiment_books ep ON e.phase_id = ep.id
LEFT JOIN public.experiment_repetitions er ON er.experiment_id = e.id
LEFT JOIN public.experiment_phase_executions soaking_e
ON soaking_e.repetition_id = er.id AND soaking_e.phase_type = 'soaking'
LEFT JOIN public.experiment_phase_executions airdrying_e
ON airdrying_e.repetition_id = er.id AND airdrying_e.phase_type = 'airdrying'
LEFT JOIN public.experiment_phase_executions cracking_e
ON cracking_e.repetition_id = er.id AND cracking_e.phase_type = 'cracking'
LEFT JOIN public.experiment_phase_executions shelling_e
ON shelling_e.repetition_id = er.id AND shelling_e.phase_type = 'shelling'
LEFT JOIN public.machine_types mt ON cracking_e.machine_type_id = mt.id
WHERE e.id = p_experiment_id
GROUP BY e.id, e.experiment_number, ep.name;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
GRANT SELECT ON public.experiments_with_all_reps_and_phases TO authenticated;
GRANT EXECUTE ON FUNCTION public.get_experiment_with_reps_and_phases(UUID) TO authenticated;
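`get_experiment_with_reps_and_phases` collapses one row per repetition into a single JSON array via `jsonb_agg(... ORDER BY er.repetition_number)`, with `COALESCE(..., '[]'::jsonb)` guaranteeing an empty array when an experiment has no repetitions yet. A hedged Python sketch of the resulting payload shape (all sample values below are made up for illustration):

```python
# Shape of one row returned by get_experiment_with_reps_and_phases:
# repetitions sorted by number, or [] when none exist (the COALESCE case).
reps = [
    {"repetition_number": 2, "status": "pending"},
    {"repetition_number": 1, "status": "completed"},
]
payload = {
    "experiment_number": 7,
    "phase_name": "Full Process - JC Cracker",
    "repetitions": sorted(reps, key=lambda r: r["repetition_number"]) if reps else [],
}
print(payload["repetitions"][0]["repetition_number"])  # 1
```

In the SQL version each array element additionally nests a `soaking`/`airdrying`/`cracking`/`shelling` object built with `jsonb_build_object`.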


@@ -0,0 +1,118 @@
-- Experiment-level phase config tables
-- One row per experiment per phase; linked by experiment_id. Used when creating an experiment
-- so soaking, airdrying, cracking, and shelling parameters are stored and can be applied to repetitions.
-- =============================================
-- 1. EXPERIMENT_SOAKING (template for soaking phase)
-- =============================================
CREATE TABLE IF NOT EXISTS public.experiment_soaking (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
soaking_duration_hr DOUBLE PRECISION NOT NULL CHECK (soaking_duration_hr >= 0),
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
created_by UUID NOT NULL REFERENCES public.user_profiles(id),
CONSTRAINT unique_experiment_soaking_per_experiment UNIQUE (experiment_id)
);
CREATE INDEX IF NOT EXISTS idx_experiment_soaking_experiment_id ON public.experiment_soaking(experiment_id);
GRANT ALL ON public.experiment_soaking TO authenticated;
ALTER TABLE public.experiment_soaking ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment soaking config is viewable by authenticated" ON public.experiment_soaking FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment soaking config is insertable by authenticated" ON public.experiment_soaking FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment soaking config is updatable by authenticated" ON public.experiment_soaking FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment soaking config is deletable by authenticated" ON public.experiment_soaking FOR DELETE USING (auth.role() = 'authenticated');
CREATE TRIGGER set_updated_at_experiment_soaking
BEFORE UPDATE ON public.experiment_soaking
FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();
-- =============================================
-- 2. EXPERIMENT_AIRDRYING (template for airdrying phase)
-- =============================================
CREATE TABLE IF NOT EXISTS public.experiment_airdrying (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
duration_minutes INTEGER NOT NULL CHECK (duration_minutes >= 0),
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
created_by UUID NOT NULL REFERENCES public.user_profiles(id),
CONSTRAINT unique_experiment_airdrying_per_experiment UNIQUE (experiment_id)
);
CREATE INDEX IF NOT EXISTS idx_experiment_airdrying_experiment_id ON public.experiment_airdrying(experiment_id);
GRANT ALL ON public.experiment_airdrying TO authenticated;
ALTER TABLE public.experiment_airdrying ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment airdrying config is viewable by authenticated" ON public.experiment_airdrying FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment airdrying config is insertable by authenticated" ON public.experiment_airdrying FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment airdrying config is updatable by authenticated" ON public.experiment_airdrying FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment airdrying config is deletable by authenticated" ON public.experiment_airdrying FOR DELETE USING (auth.role() = 'authenticated');
CREATE TRIGGER set_updated_at_experiment_airdrying
BEFORE UPDATE ON public.experiment_airdrying
FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();
-- =============================================
-- 3. EXPERIMENT_CRACKING (template for cracking; supports JC and Meyer params)
-- =============================================
CREATE TABLE IF NOT EXISTS public.experiment_cracking (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
machine_type_id UUID NOT NULL REFERENCES public.machine_types(id) ON DELETE RESTRICT,
-- JC Cracker parameters (nullable; used when machine is JC)
plate_contact_frequency_hz DOUBLE PRECISION CHECK (plate_contact_frequency_hz IS NULL OR plate_contact_frequency_hz > 0),
throughput_rate_pecans_sec DOUBLE PRECISION CHECK (throughput_rate_pecans_sec IS NULL OR throughput_rate_pecans_sec > 0),
crush_amount_in DOUBLE PRECISION CHECK (crush_amount_in IS NULL OR crush_amount_in >= 0),
entry_exit_height_diff_in DOUBLE PRECISION,
-- Meyer Cracker parameters (nullable; used when machine is Meyer)
motor_speed_hz DOUBLE PRECISION CHECK (motor_speed_hz IS NULL OR motor_speed_hz > 0),
jig_displacement_inches DOUBLE PRECISION CHECK (jig_displacement_inches IS NULL OR jig_displacement_inches >= 0),
spring_stiffness_nm DOUBLE PRECISION CHECK (spring_stiffness_nm IS NULL OR spring_stiffness_nm > 0),
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
created_by UUID NOT NULL REFERENCES public.user_profiles(id),
CONSTRAINT unique_experiment_cracking_per_experiment UNIQUE (experiment_id)
);
CREATE INDEX IF NOT EXISTS idx_experiment_cracking_experiment_id ON public.experiment_cracking(experiment_id);
CREATE INDEX IF NOT EXISTS idx_experiment_cracking_machine_type_id ON public.experiment_cracking(machine_type_id);
GRANT ALL ON public.experiment_cracking TO authenticated;
ALTER TABLE public.experiment_cracking ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment cracking config is viewable by authenticated" ON public.experiment_cracking FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment cracking config is insertable by authenticated" ON public.experiment_cracking FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment cracking config is updatable by authenticated" ON public.experiment_cracking FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment cracking config is deletable by authenticated" ON public.experiment_cracking FOR DELETE USING (auth.role() = 'authenticated');
CREATE TRIGGER set_updated_at_experiment_cracking
BEFORE UPDATE ON public.experiment_cracking
FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();
-- =============================================
-- 4. EXPERIMENT_SHELLING (template for shelling phase)
-- =============================================
CREATE TABLE IF NOT EXISTS public.experiment_shelling (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
experiment_id UUID NOT NULL REFERENCES public.experiments(id) ON DELETE CASCADE,
ring_gap_inches NUMERIC(6,2) CHECK (ring_gap_inches IS NULL OR ring_gap_inches > 0),
drum_rpm INTEGER CHECK (drum_rpm IS NULL OR drum_rpm > 0),
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
created_by UUID NOT NULL REFERENCES public.user_profiles(id),
CONSTRAINT unique_experiment_shelling_per_experiment UNIQUE (experiment_id)
);
CREATE INDEX IF NOT EXISTS idx_experiment_shelling_experiment_id ON public.experiment_shelling(experiment_id);
GRANT ALL ON public.experiment_shelling TO authenticated;
ALTER TABLE public.experiment_shelling ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Experiment shelling config is viewable by authenticated" ON public.experiment_shelling FOR SELECT USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment shelling config is insertable by authenticated" ON public.experiment_shelling FOR INSERT WITH CHECK (auth.role() = 'authenticated');
CREATE POLICY "Experiment shelling config is updatable by authenticated" ON public.experiment_shelling FOR UPDATE USING (auth.role() = 'authenticated');
CREATE POLICY "Experiment shelling config is deletable by authenticated" ON public.experiment_shelling FOR DELETE USING (auth.role() = 'authenticated');
CREATE TRIGGER set_updated_at_experiment_shelling
BEFORE UPDATE ON public.experiment_shelling
FOR EACH ROW EXECUTE FUNCTION public.handle_updated_at();


@@ -566,11 +566,11 @@ INSERT INTO public.machine_types (name, description, created_by) VALUES
ON CONFLICT (name) DO NOTHING;
-- =============================================
-- 5. CREATE EXPERIMENT PHASES
-- 5. CREATE EXPERIMENT BOOKS (table renamed from experiment_phases in migration 00016)
-- =============================================
-- Create "Phase 2 of JC Experiments" phase
INSERT INTO public.experiment_phases (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
-- Create "Phase 2 of JC Experiments" book
INSERT INTO public.experiment_books (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
SELECT
'Phase 2 of JC Experiments',
'Second phase of JC Cracker experiments for pecan processing optimization',
@@ -584,8 +584,8 @@ FROM public.user_profiles up
WHERE up.email = 's.alireza.v@gmail.com'
;
-- Create "Post Workshop Meyer Experiments" phase
INSERT INTO public.experiment_phases (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
-- Create "Post Workshop Meyer Experiments" book
INSERT INTO public.experiment_books (name, description, has_soaking, has_airdrying, has_cracking, has_shelling, cracking_machine_type_id, created_by)
SELECT
'Post Workshop Meyer Experiments',
'Post workshop Meyer Cracker experiments for pecan processing optimization',


@@ -1,7 +1,9 @@
-- ==============================================
-- 6. CREATE EXPERIMENTS FOR PHASE 2
-- ==============================================
-- TEMPORARILY DISABLED (see config.toml sql_paths). When re-enabling, replace
-- all "experiment_phases" with "experiment_books" (table renamed in migration 00016).
--
-- This seed file creates experiments from phase_2_JC_experimental_run_sheet.csv
-- Each experiment has 3 repetitions with specific parameters
-- Experiment numbers are incremented by 1 (CSV 0-19 becomes DB 1-20)