# AI Master Prompt for Space Data Stream Analyzer
## 1. PROJECT OVERVIEW
**Application Name:** AstroStream AI
**Purpose:** To provide a cutting-edge SaaS platform for analyzing, processing, and visualizing high-resolution video and data streams from space missions, such as NASA's Artemis II.
**Problem Solved:** Analyzing vast amounts of complex data from space missions is time-consuming and requires specialized tools. Current methods often lack advanced AI capabilities for real-time insights and deep analysis.
**Value Proposition:** AstroStream AI empowers scientists, engineers, and space enthusiasts with AI-driven tools to extract meaningful insights from space mission data, enabling faster discovery, improved mission planning, and enhanced public engagement through unprecedented visualization and analysis capabilities. The platform will handle high-bandwidth data streams (e.g., 4K video at 260 Mbps) efficiently.
## 2. TECH STACK
- **Frontend Framework:** React.js (using functional components and hooks)
- **Styling:** Tailwind CSS (for rapid UI development and responsive design)
- **State Management:** Zustand (lightweight and efficient for global state)
- **Routing:** React Router DOM
- **Video Player:** Video.js or a custom solution leveraging HTML5 video API with WebRTC for potential real-time streaming.
- **Data Visualization:** Chart.js or Recharts for charting bandwidth, data rates, etc.
- **AI Integration:** Placeholder for API calls to a backend AI service (e.g., TensorFlow.js for on-device processing or a REST API for a cloud-based model).
- **API Communication:** Axios for HTTP requests.
- **Build Tool:** Vite (for fast development server and optimized builds)
- **Deployment:** Docker for containerization, Vercel/Netlify for frontend deployment.
## 3. CORE FEATURES
**3.1. High-Resolution Video Stream Uploader & Player:**
- **User Flow:** Users can upload pre-recorded video files or potentially connect to live streams (future enhancement).
- **Functionality:** A robust player that supports high-resolution formats (4K). It should display playback controls, time scrubber, volume, fullscreen, and quality settings. Handles large file uploads efficiently, possibly with chunking.
- **Details:** The player UI should be clean and unobtrusive. Loading states should be clearly indicated. Support for common codecs.
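The chunked-upload idea above can be sketched as a small helper. This is an illustrative sketch only: the chunk size and the idea of POSTing each slice to a hypothetical `/api/uploads/:id/chunks` endpoint are assumptions, not part of any existing API.

```javascript
// Sketch of client-side chunking for large video uploads (assumed helper).
// Each slice could be POSTed separately to a hypothetical
// /api/uploads/:id/chunks endpoint, updating the `progress` field as it goes.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MiB per chunk (tunable)

function sliceIntoChunks(blob, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    chunks.push({
      index: chunks.length,
      data: blob.slice(offset, offset + chunkSize),
      // Upload progress (%) once this chunk has finished uploading
      progressAfter: Math.min(100, Math.round(((offset + chunkSize) / blob.size) * 100)),
    });
  }
  return chunks;
}
```

Chunking also makes retries cheap: a failed slice can be re-sent without restarting a multi-gigabyte upload.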
**3.2. AI-Powered Object Recognition & Analysis:**
- **User Flow:** After uploading a video, users can initiate an AI analysis. The system identifies predefined objects (e.g., lunar modules, astronauts, specific geological features, equipment) and their movements. Results are overlaid on the video or listed separately.
- **Functionality:** Integrates with an AI model (via API) to process video frames. Generates bounding boxes, labels, and confidence scores for detected objects. Tracks object movement over time.
- **Details:** Users can configure analysis parameters (e.g., object types to detect). Analysis progress is shown to the user.
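Because analysis is a long-running backend job, the frontend will likely poll for status. Below is a minimal, dependency-free polling sketch; the job shape (`{ status: ... }`) and the idea of a `GET /api/analysis/:jobId` endpoint are assumptions. The status-fetching function is injected so it can wrap Axios, `fetch`, or a mock.

```javascript
// Generic poller for an asynchronous analysis job (sketch, not a final API).
// fetchStatus: () => Promise<{ status: string, ... }>
//   e.g. () => axios.get(`/api/analysis/${jobId}`).then((r) => r.data)
function pollAnalysis(fetchStatus, { intervalMs = 2000, maxAttempts = 30 } = {}) {
  return new Promise((resolve, reject) => {
    let attempts = 0;
    const tick = () => {
      attempts += 1;
      fetchStatus()
        .then((job) => {
          if (job.status === 'completed' || job.status === 'failed') {
            resolve(job); // terminal state: surface results or error to the UI
          } else if (attempts >= maxAttempts) {
            reject(new Error('Analysis timed out'));
          } else {
            setTimeout(tick, intervalMs); // still processing: try again later
          }
        })
        .catch(reject);
    };
    tick();
  });
}
```

Each intermediate response can also carry a progress percentage to drive the analysis progress indicator mentioned above.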
**3.3. Data Stream Visualization Panel:**
- **User Flow:** Displays real-time or historical data related to the stream, such as bandwidth usage (e.g., 260 Mbps), data packet loss, latency, and system health metrics.
- **Functionality:** Utilizes charting libraries to present data in an easily digestible format (line graphs, gauges).
- **Details:** Customizable dashboard allowing users to select which metrics to display. Real-time data updates should be smooth.
**3.4. Automatic Key Moment Detection:**
- **User Flow:** The AI automatically identifies and timestamps significant events within the footage (e.g., rocket launch ignition, landing sequence, critical maneuvers, discoveries).
- **Functionality:** Analyzes video content and associated metadata (if available) to flag important moments. Creates a timeline of these events.
- **Details:** Users can review, confirm, reject, or manually add key moments. These moments can be bookmarked for quick access.
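To seek the player to a key moment, its `"HH:MM:SS.mmm"` timestamp (the format used in the mock data of section 5) needs converting to seconds. A minimal sketch, with a hypothetical helper for building the confirmed-moments timeline:

```javascript
// Convert "HH:MM:SS.mmm" to seconds so the player can seek to the moment,
// e.g. videoRef.current.currentTime = timestampToSeconds(moment.timestamp)
function timestampToSeconds(timestamp) {
  const [hours, minutes, seconds] = timestamp.split(':');
  return Number(hours) * 3600 + Number(minutes) * 60 + Number(seconds);
}

// Hypothetical helper: only confirmed moments appear on the timeline,
// ordered chronologically
function sortedConfirmedMoments(keyMoments) {
  return keyMoments
    .filter((m) => m.confirmed)
    .sort((a, b) => timestampToSeconds(a.timestamp) - timestampToSeconds(b.timestamp));
}
```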
**3.5. Sharing & Reporting:**
- **User Flow:** Users can export analysis results, selected video clips with annotations, and generated charts. They can share these findings within their team or publicly.
- **Functionality:** Generates shareable links, downloadable reports (PDF), and allows exporting annotated video clips.
- **Details:** Options for different export formats and levels of detail.
## 4. UI/UX DESIGN
- **Layout:** Single Page Application (SPA) with a clear navigation structure. Main sections: Dashboard (overview), Video Library/Uploader, Analysis Workspace, Data Visualization, Settings.
- **Color Palette:**
- Primary: Deep Space Blue (`#0A192F`)
- Secondary: Nebula Cyan (`#61DAFB`, React-inspired accent)
- Accent: Cosmic Orange (`#FF8C00`)
- Neutrals: Dark Gray (`#222`), Light Gray (`#888`), Off-White (`#F0F0F0`)
- **Typography:** Sans-serif fonts. Headings: Inter (Bold), Body Text: Inter (Regular). Ensure readability.
- **Responsiveness:** Mobile-first approach. Utilize Tailwind's responsive modifiers (`sm:`, `md:`, `lg:`) extensively. Ensure usability across various screen sizes.
- **Component Hierarchy:** Clean, modular structure. Reusable components for buttons, cards, modals, charts, etc.
- **Interactions:** Subtle animations on hover, focus, and loading states. Smooth transitions between views.
## 5. DATA MODEL
- **State Structure (Zustand Store):**
```javascript
// Example state structure
{
  auth: { user: null, isAuthenticated: false },
  uploads: [
    { id: 'upload-1', name: 'Artemis_II_Launch.mp4', status: 'completed', progress: 100, videoUrl: '...', thumbnailUrl: '...' },
    { id: 'upload-2', name: 'Lunar_Surface_Scan.mov', status: 'uploading', progress: 45, videoUrl: null }
  ],
  selectedVideo: { id: null, url: null, analysisData: null, keyMoments: [] },
  analysisSettings: { objectTypes: ['rover', 'lander', 'astronaut'], sensitivity: 'medium' },
  streamData: { bandwidth: 255, latency: 50, packetLoss: 0.1 }, // Example metrics
  ui: { isLoading: false, error: null, modal: null }
}
```
- **Mock Data Formats:**
- **Analysis Result:**
```jsonc
{
  "timestamp": "00:05:32.123",
  "detections": [
    {
      "box": [100, 200, 300, 400], // [x1, y1, x2, y2]
      "label": "Lunar Lander",
      "confidence": 0.92
    },
    {
      "box": [500, 150, 650, 350],
      "label": "Astronaut",
      "confidence": 0.88
    }
  ]
}
```
- **Key Moment:**
```jsonc
{
  "timestamp": "00:15:10.500",
  "description": "First Step on Lunar Surface",
  "type": "event", // 'event', 'discovery', 'anomaly'
  "confirmed": true
}
```
- **Stream Metrics:**
```json
{
  "timestamp": "2023-10-27T10:30:00Z",
  "bandwidthMbps": 260.5,
  "latencyMs": 45,
  "packetLossPercent": 0.05
}
```
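The store shape above can be exercised with a dependency-free stand-in that mirrors Zustand's vanilla API (`getState` / `setState` / `subscribe`). In the real app this would be `create()` from the `zustand` package; this sketch only illustrates how slices like `ui` or `streamData` get updated immutably.

```javascript
// Minimal Zustand-style store (sketch; the real app would use the zustand package)
function createStore(initialState) {
  let state = initialState;
  const listeners = new Set();
  return {
    getState: () => state,
    setState: (partial) => {
      // Accept either a partial object or an updater function, like Zustand
      state = { ...state, ...(typeof partial === 'function' ? partial(state) : partial) };
      listeners.forEach((listener) => listener(state));
    },
    subscribe: (listener) => {
      listeners.add(listener);
      return () => listeners.delete(listener); // unsubscribe
    },
  };
}

const store = createStore({
  streamData: { bandwidth: 255, latency: 50, packetLoss: 0.1 },
  ui: { isLoading: false, error: null, modal: null },
});

// Updating one slice immutably, leaving the others untouched
store.setState((s) => ({ streamData: { ...s.streamData, bandwidth: 260.5 } }));
```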
## 6. COMPONENT BREAKDOWN
- **`App.jsx`:** Main application component, sets up routing and global layout.
- **`Layout.jsx`:** Main layout wrapper, includes header, sidebar (if applicable), and content area.
- **`Header.jsx`:** Top navigation bar with logo, user profile, and main links.
- **`DashboardPage.jsx`:** Main landing page showing recent uploads, analysis summaries, and system status.
- **`VideoLibraryPage.jsx`:** Displays uploaded videos, includes search/filter functionality.
- **`UploadComponent.jsx`:** Handles video file uploading, including progress indication and chunking logic.
- **`VideoPlayer.jsx`:** Custom video player component integrating Video.js or similar.
- Props: `src`, `thumbnailUrl`, `options`, `onLoadedData`, `onTimeUpdate`.
- **`AnalysisWorkspacePage.jsx`:** The core page for performing and viewing analyses.
- Contains `VideoPlayer`, `AnalysisControls`, `ResultsOverlay`.
- **`AnalysisControls.jsx`:** UI elements for configuring and starting AI analysis.
- Props: `availableObjectTypes`, `currentSettings`, `onStartAnalysis`, `onSaveSettings`.
- **`ResultsOverlay.jsx`:** Renders AI detection bounding boxes and labels directly on the video player.
- Props: `detections`, `videoDimensions`.
- **`DataVisualizationPage.jsx`:** Displays charts and graphs for stream metrics.
- **`ChartComponent.jsx`:** Reusable chart component (e.g., Line chart for bandwidth).
- Props: `data`, `options`, `type` (e.g., 'line', 'gauge').
- **`KeyMomentsTimeline.jsx`:** Displays detected key moments with timestamps and descriptions.
- Props: `keyMoments`, `onAddMoment`, `onEditMoment`, `onDeleteMoment`.
- **`ShareReportModal.jsx`:** Modal for generating and sharing reports or clips.
- **`Auth Components`:** (`LoginPage`, `RegisterPage`, `UserProfile`) - Basic authentication UI.
- **`UI Components`:** (`Button`, `Card`, `Modal`, `Input`, `Spinner`, `Alert`) - Reusable UI elements styled with Tailwind.
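`ResultsOverlay` needs one piece of coordinate math worth pinning down: detection boxes arrive in source-video pixels (`[x1, y1, x2, y2]`, per the mock data above), but the player is usually rendered at a different size. A sketch of the scaling, assuming the overlay is absolutely positioned over the player:

```javascript
// Scale a detection box from source-video pixels to display (CSS) pixels.
// box: [x1, y1, x2, y2]; dimensions: { width, height }
function scaleBox(box, videoDimensions, displayDimensions) {
  const scaleX = displayDimensions.width / videoDimensions.width;
  const scaleY = displayDimensions.height / videoDimensions.height;
  const [x1, y1, x2, y2] = box;
  return {
    left: x1 * scaleX,
    top: y1 * scaleY,
    width: (x2 - x1) * scaleX,
    height: (y2 - y1) * scaleY,
  };
}
```

The returned values map directly onto an absolutely positioned `<div>` per detection, with the label and confidence rendered alongside.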
## 7. ANIMATIONS & INTERACTIONS
- **Hover Effects:** Subtle background color changes or slight scaling on buttons, cards, and links.
- **Transitions:** Smooth transitions for modal pop-ups, sidebar menus, and route changes (e.g., fade-in/out using `Transition` component from `react-transition-group` or Framer Motion).
- **Loading States:** Implement skeleton loaders or spinners (`<Spinner />` component) for data fetching, video processing, and AI analysis. Use `ui.isLoading` state.
- **Micro-interactions:** Button click feedback, input field focus states, subtle animations on chart data loading.
- **Video Player:** Smooth playback, responsive controls that appear/disappear on interaction.
- **Progress Bars:** Animated progress bars for uploads and analysis tasks.
## 8. EDGE CASES
- **Empty States:** Display informative messages and clear calls-to-action when no videos are uploaded, no analysis data is available, or no key moments are detected.
- **Error Handling:**
- Gracefully handle API errors (e.g., network issues, server errors) with user-friendly messages using the `ui.error` state and `<Alert />` component.
- Handle invalid file uploads (wrong format, too large).
- Display specific errors during AI analysis failures.
- **Validation:**
- Input validation for forms (e.g., report descriptions, settings).
- Upload validation (file type, size limits).
- **Accessibility (a11y):**
- Use semantic HTML elements.
- Ensure sufficient color contrast.
- Implement ARIA attributes where necessary.
- Keyboard navigation support for all interactive elements.
- Alt text for images and informative labels for icons.
- **Data Loading:** Show loading indicators for all asynchronous operations. Prevent user interaction with disabled elements during loading.
- **Video Player:** Handle different video loading states (buffering, error, ended).
- **AI Analysis:** If the AI model returns no detections, display a message indicating that no objects were found rather than an empty overlay.
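The upload-validation cases above can be sketched as a single helper. The limits and MIME types here are illustrative placeholders, not product requirements; the returned string is meant to feed the `<Alert />` component via `ui.error`.

```javascript
// Client-side upload validation sketch (limits are placeholders).
// Returns an error message string, or null when the file is acceptable.
const ALLOWED_TYPES = ['video/mp4', 'video/quicktime', 'video/webm'];
const MAX_BYTES = 20 * 1024 * 1024 * 1024; // 20 GiB, illustrative limit

function validateUpload(file, { allowedTypes = ALLOWED_TYPES, maxBytes = MAX_BYTES } = {}) {
  if (!allowedTypes.includes(file.type)) {
    return `Unsupported format: ${file.type || 'unknown'}. Allowed: ${allowedTypes.join(', ')}.`;
  }
  if (file.size > maxBytes) {
    return `File is too large (${(file.size / 1e9).toFixed(1)} GB).`;
  }
  return null;
}
```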
## 9. SAMPLE DATA
- **Video Uploads (Array in `uploads` state):**
```json
[
  {
    "id": "artemis-ii-launch-001",
    "name": "Artemis II Launch Sequence",
    "status": "completed",
    "progress": 100,
    "videoUrl": "/api/videos/artemis-ii-launch-001.mp4",
    "thumbnailUrl": "/api/thumbnails/artemis-ii-launch-001.jpg",
    "uploadTimestamp": "2023-10-27T08:00:00Z"
  },
  {
    "id": "lunar-rover-cam-005",
    "name": "Lunar Rover - Sol 15",
    "status": "processing_ai",
    "progress": 75,
    "videoUrl": "/api/videos/lunar-rover-cam-005.mp4",
    "thumbnailUrl": "/api/thumbnails/lunar-rover-cam-005.jpg",
    "uploadTimestamp": "2023-10-26T15:30:00Z"
  },
  {
    "id": "arriving-soon-vid",
    "name": "Upcoming Mission Footage",
    "status": "uploading",
    "progress": 30,
    "videoUrl": null,
    "thumbnailUrl": null,
    "uploadTimestamp": "2023-10-27T11:00:00Z"
  }
]
```
- **Analysis Data for `selectedVideo` (Example):**
```jsonc
{
  "id": "artemis-ii-launch-001",
  "analysisTimestamp": "2023-10-27T09:00:00Z",
  "frames": [
    {
      "timestamp": "00:00:05.100",
      "detections": [
        { "box": [300, 400, 450, 550], "label": "Rocket Body", "confidence": 0.99 }
      ]
    },
    {
      "timestamp": "00:00:08.500",
      "detections": [
        { "box": [350, 450, 500, 600], "label": "Flame Vent", "confidence": 0.95 }
      ]
    },
    {
      "timestamp": "00:00:10.000",
      "detections": [] // No specific objects detected in this frame
    }
  ],
  "keyMoments": [
    { "timestamp": "00:00:08.500", "description": "Ignition Sequence Start", "type": "event", "confirmed": true },
    { "timestamp": "00:00:12.250", "description": "Liftoff", "type": "event", "confirmed": true }
  ]
}
```
- **Stream Metrics (Example for `streamData` state):**
```json
{
  "bandwidthMbps": 258.7,
  "latencyMs": 52,
  "packetLossPercent": 0.12,
  "timestamp": "2023-10-27T10:45:05Z"
}
```
## 10. DEPLOYMENT NOTES
- **Build Command:** `npm run build` or `yarn build` (using Vite).
- **Environment Variables:**
- `VITE_API_BASE_URL`: URL for the backend API (for AI processing, data retrieval).
- `VITE_AUTH_PROVIDER_URL`: URL for authentication service.
- `NODE_ENV`: Set to `production` for optimized builds.
- **Performance Optimizations:**
- Code splitting with Vite.
- Lazy loading components (`React.lazy` and `Suspense`).
- Image optimization for thumbnails.
- Efficient state management to prevent unnecessary re-renders.
- Use `useMemo` and `useCallback` where appropriate.
- **Containerization:** Create a `Dockerfile` for the React application (e.g., using Nginx to serve static files).
- **CI/CD:** Set up a pipeline (e.g., GitHub Actions) for automated testing, building, and deployment to Vercel/Netlify or a cloud provider.
- **Video Handling:** For large file uploads and high-bandwidth streaming, consider a dedicated backend service or cloud storage solution (e.g., AWS S3, Google Cloud Storage) with appropriate CDN integration. WebRTC might be needed for low-latency live streaming down the line.
- **AI Backend:** The AI processing will likely require a separate, scalable backend service. The frontend will communicate with this backend via REST APIs or GraphQL. Ensure the API design is robust and handles asynchronous tasks effectively.
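API access against `VITE_API_BASE_URL` is easiest to keep consistent behind one small helper. A sketch of the URL-joining logic, shown standalone so it can run outside Vite (in the app, the base would come from `import.meta.env.VITE_API_BASE_URL` and the helper would feed Axios; the function name is an assumption):

```javascript
// Join the configured API base URL with a request path, tolerating
// trailing/leading slashes however the env var happens to be written.
function apiUrl(baseUrl, path) {
  return `${baseUrl.replace(/\/+$/, '')}/${path.replace(/^\/+/, '')}`;
}

// In Vite code:
//   const BASE = import.meta.env.VITE_API_BASE_URL;
//   axios.get(apiUrl(BASE, '/videos'));
```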