Compare commits


5 Commits

Author SHA1 Message Date
copilot-swe-agent[bot]
23da976494 Remove translation changes from non-English locale files
Only en.json should have the new translation keys. Translators will add translations to other language files later.

Co-authored-by: tankerkiller125 <3457368+tankerkiller125@users.noreply.github.com>
2025-12-27 22:29:18 +00:00
copilot-swe-agent[bot]
aa48c958d7 Improve error handling and comments in WipeInventory method
Co-authored-by: katosdev <7927609+katosdev@users.noreply.github.com>
2025-12-27 15:03:43 +00:00
copilot-swe-agent[bot]
2bd6d0a9e5 Add Wipe Inventory feature with backend and frontend implementation
Co-authored-by: katosdev <7927609+katosdev@users.noreply.github.com>
2025-12-27 14:52:04 +00:00
copilot-swe-agent[bot]
88275620f2 Initial plan for Wipe Inventory feature
Co-authored-by: katosdev <7927609+katosdev@users.noreply.github.com>
2025-12-27 14:47:42 +00:00
copilot-swe-agent[bot]
5a058250e6 Initial plan
2025-12-27 14:43:31 +00:00
17 changed files with 105 additions and 1282 deletions

View File

@@ -1,259 +0,0 @@
# HomeBox Upgrade Testing Workflow
This document describes the automated upgrade testing workflow for HomeBox.
## Overview
The upgrade test workflow is designed to ensure data integrity and functionality when upgrading HomeBox from one version to another. It automatically:
1. Deploys a stable version of HomeBox
2. Creates test data (users, items, locations, labels, notifiers, attachments)
3. Upgrades to the latest version from the main branch
4. Verifies all data and functionality remain intact
## Workflow File
**Location**: `.github/workflows/upgrade-test.yaml`
## Trigger Conditions
The workflow runs:
- **Daily**: Automatically at 2 AM UTC (via cron schedule)
- **Manual**: Can be triggered manually via GitHub Actions UI
- **On Push**: When changes are made to the workflow files or test scripts
## Test Scenarios
### 1. Environment Setup
- Pulls the latest stable HomeBox Docker image from GHCR
- Starts the application with test configuration
- Ensures the service is healthy and ready
### 2. Data Creation
The workflow creates comprehensive test data using the `create-test-data.sh` script (a TypeScript sketch of the equivalent API calls follows the lists below):
#### Users and Groups
- **Group 1**: 5 users (user1@homebox.test through user5@homebox.test)
- **Group 2**: 2 users (user6@homebox.test and user7@homebox.test)
- All users have password: `TestPassword123!`
#### Locations
- **Group 1**: Living Room, Garage
- **Group 2**: Home Office
#### Labels
- **Group 1**: Electronics, Important
- **Group 2**: Work Equipment
#### Items
- **Group 1**: 5 items (Laptop Computer, Power Drill, TV Remote, Tool Box, Coffee Maker)
- **Group 2**: 2 items (Monitor, Keyboard)
#### Attachments
- Multiple attachments added to various items (receipts, manuals, warranties)
#### Notifiers
- **Group 1**: Test notifier named "TESTING"
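The script drives these steps with `curl` and `jq`. Purely for orientation, the same calls look roughly like this in TypeScript; the endpoints, payload fields, and response fields mirror what the script sends and reads back with `jq`, not an independently verified client:
```typescript
// Illustrative only: mirrors the payloads create-test-data.sh sends with curl
// and the response fields it reads back with jq (.token, .group.inviteToken, .id).
const BASE = process.env.HOMEBOX_URL || "http://localhost:7745";

async function post<T>(path: string, body: unknown, token?: string): Promise<T> {
  const res = await fetch(`${BASE}/api/v1${path}`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(token ? { Authorization: `Bearer ${token}` } : {}),
    },
    body: JSON.stringify(body),
  });
  return (await res.json()) as T;
}

async function seedExample(): Promise<void> {
  // Registering the first user creates a new group; the script reuses the
  // returned inviteToken to register the remaining users into that group.
  const user1 = await post<{ token: string; group: { inviteToken: string } }>(
    "/users/register",
    { email: "user1@homebox.test", name: "User One", password: "TestPassword123!" }
  );

  // Locations and items are then created with the returned bearer token.
  const living = await post<{ id: string }>(
    "/locations",
    { name: "Living Room", description: "Main living area" },
    user1.token
  );
  await post(
    "/items",
    { name: "Laptop Computer", description: "Dell XPS 15 for work", locationId: living.id },
    user1.token
  );
}

seedExample().catch(console.error);
```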
### 3. Upgrade Process
1. Stops the stable version container
2. Builds a fresh image from the current main branch
3. Copies the database to a new location
4. Starts the new version with the existing data
### 4. Verification Tests
The Playwright test suite (`upgrade-verification.spec.ts`) verifies the following (a condensed sketch appears after this list):
- **User Authentication**: All 7 users can log in with their credentials
- **Data Persistence**: All items, locations, and labels are present
- **Attachments**: File attachments are correctly associated with items
- **Notifiers**: The "TESTING" notifier is still configured
- **UI Functionality**: Version display and theme switching work correctly
- **Data Isolation**: Groups can only see their own data
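The full spec appears later in this compare view. As a condensed sketch, a single login-plus-persistence check looks roughly like this; the selectors and the `/home` redirect mirror `upgrade-verification.spec.ts`:
```typescript
import { expect, test } from "@playwright/test";
import * as fs from "fs";

// Test data written by create-test-data.sh (path overridable via TEST_DATA_FILE).
const testDataPath = process.env.TEST_DATA_FILE || "/tmp/test-users.json";
const testData = JSON.parse(fs.readFileSync(testDataPath, "utf-8"));

test("user1 can still log in and sees pre-upgrade data", async ({ page }) => {
  const user = testData.users[0];

  // Log in through the UI, exactly as the verification suite does.
  await page.goto("/");
  await page.fill("input[type='text']", user.email);
  await page.fill("input[type='password']", user.password);
  await page.click("button[type='submit']");
  await expect(page).toHaveURL("/home", { timeout: 10000 });

  // Data persistence: an item created before the upgrade is still visible.
  await expect(page.locator("body")).toContainText("Laptop Computer");
});
```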
## Test Data File
The setup script generates a JSON file at `/tmp/test-users.json` containing:
```json
{
  "users": [
    {
      "email": "user1@homebox.test",
      "password": "TestPassword123!",
      "token": "...",
      "group": "1"
    },
    ...
  ],
  "locations": {
    "group1": ["location-id-1", "location-id-2"],
    "group2": ["location-id-3"]
  },
  "labels": {...},
  "items": {...},
  "notifiers": {...}
}
```
This file is used by the Playwright tests to verify data integrity.
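The spec reads this file once before any test runs. A minimal sketch of that loading step, using the same interfaces as `upgrade-verification.spec.ts`:
```typescript
import * as fs from "fs";

interface TestUser {
  email: string;
  password: string;
  token: string;
  group: string;
}

interface TestData {
  users?: TestUser[];
  locations?: Record<string, string[]>;
  labels?: Record<string, string[]>;
  items?: Record<string, string[]>;
  notifiers?: Record<string, string[]>;
}

// Fail fast if create-test-data.sh never ran; otherwise parse the JSON once.
const testDataPath = process.env.TEST_DATA_FILE || "/tmp/test-users.json";
if (!fs.existsSync(testDataPath)) {
  throw new Error(`Test data file not found at ${testDataPath}`);
}
const testData: TestData = JSON.parse(fs.readFileSync(testDataPath, "utf-8"));
```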
## Scripts
### create-test-data.sh
**Location**: `.github/scripts/upgrade-test/create-test-data.sh`
**Purpose**: Creates all test data via the HomeBox REST API
**Environment Variables**:
- `HOMEBOX_URL`: Base URL of the HomeBox instance (default: http://localhost:7745)
- `TEST_DATA_FILE`: Path to output JSON file (default: /tmp/test-users.json)
**Requirements**:
- `curl`: For API calls
- `jq`: For JSON processing
**Usage**:
```bash
export HOMEBOX_URL=http://localhost:7745
./.github/scripts/upgrade-test/create-test-data.sh
```
## Running Tests Locally
To run the upgrade tests locally:
### Prerequisites
```bash
# Install dependencies
sudo apt-get install -y jq curl docker.io
# Install pnpm and Playwright
cd frontend
pnpm install
pnpm exec playwright install --with-deps chromium
```
### Run the test
```bash
# Start stable version
docker run -d \
--name homebox-test \
-p 7745:7745 \
-e HBOX_OPTIONS_ALLOW_REGISTRATION=true \
-v /tmp/homebox-data:/data \
ghcr.io/sysadminsmedia/homebox:latest
# Wait for startup
sleep 10
# Create test data
export HOMEBOX_URL=http://localhost:7745
./.github/scripts/upgrade-test/create-test-data.sh
# Stop container
docker stop homebox-test
docker rm homebox-test
# Build new version
docker build -t homebox:test .
# Start new version with existing data
docker run -d \
--name homebox-test \
-p 7745:7745 \
-e HBOX_OPTIONS_ALLOW_REGISTRATION=true \
-v /tmp/homebox-data:/data \
homebox:test
# Wait for startup
sleep 10
# Run verification tests
cd frontend
TEST_DATA_FILE=/tmp/test-users.json \
E2E_BASE_URL=http://localhost:7745 \
pnpm exec playwright test \
--project=chromium \
test/upgrade/upgrade-verification.spec.ts
# Cleanup
docker stop homebox-test
docker rm homebox-test
```
## Artifacts
The workflow produces several artifacts:
1. **playwright-report-upgrade-test**: HTML report of test results
2. **playwright-traces**: Detailed traces for debugging failures
3. **Docker logs**: Collected on failure for troubleshooting
## Failure Scenarios
The workflow will fail if:
- The stable version fails to start
- Test data creation fails
- The new version fails to start with existing data
- Any verification test fails
- Database migrations fail
## Troubleshooting
### Test Data Creation Fails
Check the Docker logs:
```bash
docker logs homebox-old
```
Verify the API is accessible:
```bash
curl http://localhost:7745/api/v1/status
```
### Verification Tests Fail
1. Download the Playwright report from GitHub Actions artifacts
2. Review the HTML report for detailed failure information
3. Check traces for visual debugging
### Database Issues
If migrations fail:
```bash
# Check database file
ls -lh /tmp/homebox-data-new/homebox.db
# Check Docker logs for migration errors
docker logs homebox-new
```
## Future Enhancements
Potential improvements:
- [ ] Test multiple upgrade paths (e.g., v0.10 → v0.11 → v0.12)
- [ ] Test with PostgreSQL backend in addition to SQLite
- [ ] Add performance benchmarks
- [ ] Test with larger datasets
- [ ] Add API-level verification in addition to UI tests (see the sketch after this list)
- [ ] Test backup and restore functionality
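For the API-level idea above, one possible shape is to reuse the endpoints the setup script already exercises (`/api/v1/users/login`, `/api/v1/items`). The sketch below assumes that approach; the `GET /items` response fields (`total`, `items`) are assumptions that would need to be confirmed against the HomeBox API docs:
```typescript
// Hypothetical API-level verification: count items over REST instead of via the UI.
// The login payload matches create-test-data.sh; `total`/`items` on the GET /items
// response are assumptions to confirm against the HomeBox API documentation.
const BASE = process.env.E2E_BASE_URL || "http://localhost:7745";

async function login(email: string, password: string): Promise<string> {
  const res = await fetch(`${BASE}/api/v1/users/login`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username: email, password }),
  });
  const body = await res.json();
  return body.token;
}

async function countItems(token: string): Promise<number> {
  const res = await fetch(`${BASE}/api/v1/items`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  const body = await res.json();
  return body.total ?? body.items?.length ?? 0;
}

async function verify(): Promise<void> {
  const token = await login("user1@homebox.test", "TestPassword123!");
  const count = await countItems(token);
  if (count !== 5) {
    throw new Error(`Expected 5 items for group 1, found ${count}`);
  }
}

verify().catch(err => {
  console.error(err);
  process.exit(1);
});
```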
## Related Files
- `.github/workflows/upgrade-test.yaml` - Main workflow definition
- `.github/scripts/upgrade-test/create-test-data.sh` - Data generation script
- `frontend/test/upgrade/upgrade-verification.spec.ts` - Playwright verification tests
- `.github/workflows/e2e-partial.yaml` - Standard E2E test workflow (for reference)
## Support
For issues or questions about this workflow:
1. Check the GitHub Actions run logs
2. Review this documentation
3. Open an issue in the repository

View File

@@ -1,413 +0,0 @@
#!/bin/bash
# Script to create test data in HomeBox for upgrade testing
# This script creates users, items, attachments, notifiers, locations, and labels
set -e
HOMEBOX_URL="${HOMEBOX_URL:-http://localhost:7745}"
API_URL="${HOMEBOX_URL}/api/v1"
TEST_DATA_FILE="${TEST_DATA_FILE:-/tmp/test-users.json}"
echo "Creating test data in HomeBox at $HOMEBOX_URL"
# Function to make API calls with error handling
api_call() {
local method=$1
local endpoint=$2
local data=$3
local token=$4
if [ -n "$token" ]; then
if [ -n "$data" ]; then
curl -s -X "$method" \
-H "Authorization: Bearer $token" \
-H "Content-Type: application/json" \
-d "$data" \
"$API_URL$endpoint"
else
curl -s -X "$method" \
-H "Authorization: Bearer $token" \
-H "Content-Type: application/json" \
"$API_URL$endpoint"
fi
else
if [ -n "$data" ]; then
curl -s -X "$method" \
-H "Content-Type: application/json" \
-d "$data" \
"$API_URL$endpoint"
else
curl -s -X "$method" \
-H "Content-Type: application/json" \
"$API_URL$endpoint"
fi
fi
}
# Function to register a user and get token
register_user() {
local email=$1
local name=$2
local password=$3
local group_token=$4
echo "Registering user: $email"
local payload="{\"email\":\"$email\",\"name\":\"$name\",\"password\":\"$password\""
if [ -n "$group_token" ]; then
payload="$payload,\"groupToken\":\"$group_token\""
fi
payload="$payload}"
local response=$(curl -s -X POST \
-H "Content-Type: application/json" \
-d "$payload" \
"$API_URL/users/register")
echo "$response"
}
# Function to login and get token
login_user() {
local email=$1
local password=$2
echo "Logging in user: $email" >&2
local response=$(curl -s -X POST \
-H "Content-Type: application/json" \
-d "{\"username\":\"$email\",\"password\":\"$password\"}" \
"$API_URL/users/login")
echo "$response" | jq -r '.token // empty'
}
# Function to create an item
create_item() {
local token=$1
local name=$2
local description=$3
local location_id=$4
echo "Creating item: $name" >&2
local payload="{\"name\":\"$name\",\"description\":\"$description\""
if [ -n "$location_id" ]; then
payload="$payload,\"locationId\":\"$location_id\""
fi
payload="$payload}"
local response=$(curl -s -X POST \
-H "Authorization: Bearer $token" \
-H "Content-Type: application/json" \
-d "$payload" \
"$API_URL/items")
echo "$response"
}
# Function to create a location
create_location() {
local token=$1
local name=$2
local description=$3
echo "Creating location: $name" >&2
local response=$(curl -s -X POST \
-H "Authorization: Bearer $token" \
-H "Content-Type: application/json" \
-d "{\"name\":\"$name\",\"description\":\"$description\"}" \
"$API_URL/locations")
echo "$response"
}
# Function to create a label
create_label() {
local token=$1
local name=$2
local description=$3
echo "Creating label: $name" >&2
local response=$(curl -s -X POST \
-H "Authorization: Bearer $token" \
-H "Content-Type: application/json" \
-d "{\"name\":\"$name\",\"description\":\"$description\"}" \
"$API_URL/labels")
echo "$response"
}
# Function to create a notifier
create_notifier() {
local token=$1
local name=$2
local url=$3
echo "Creating notifier: $name" >&2
local response=$(curl -s -X POST \
-H "Authorization: Bearer $token" \
-H "Content-Type: application/json" \
-d "{\"name\":\"$name\",\"url\":\"$url\",\"isActive\":true}" \
"$API_URL/groups/notifiers")
echo "$response"
}
# Function to attach a file to an item (creates a dummy attachment)
attach_file_to_item() {
local token=$1
local item_id=$2
local filename=$3
echo "Creating attachment for item: $item_id" >&2
# Create a temporary file with some content
local temp_file=$(mktemp)
echo "This is a test attachment for $filename" > "$temp_file"
local response=$(curl -s -X POST \
-H "Authorization: Bearer $token" \
-F "file=@$temp_file" \
-F "type=attachment" \
-F "name=$filename" \
"$API_URL/items/$item_id/attachments")
rm -f "$temp_file"
echo "$response"
}
# Initialize test data storage
echo "{\"users\":[]}" > "$TEST_DATA_FILE"
echo "=== Step 1: Create first group with 5 users ==="
# Register first user (creates a new group)
user1_response=$(register_user "user1@homebox.test" "User One" "TestPassword123!")
user1_token=$(echo "$user1_response" | jq -r '.token // empty')
group_token=$(echo "$user1_response" | jq -r '.group.inviteToken // empty')
if [ -z "$user1_token" ]; then
echo "Failed to register first user"
echo "Response: $user1_response"
exit 1
fi
echo "First user registered with token. Group token: $group_token"
# Store user1 data
jq --arg email "user1@homebox.test" \
--arg password "TestPassword123!" \
--arg token "$user1_token" \
--arg group "1" \
'.users += [{"email":$email,"password":$password,"token":$token,"group":$group}]' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
# Register 4 more users in the same group
for i in {2..5}; do
echo "Registering user$i in group 1..."
user_response=$(register_user "user${i}@homebox.test" "User $i" "TestPassword123!" "$group_token")
user_token=$(echo "$user_response" | jq -r '.token // empty')
if [ -z "$user_token" ]; then
echo "Failed to register user$i"
echo "Response: $user_response"
else
echo "user$i registered successfully"
# Store user data
jq --arg email "user${i}@homebox.test" \
--arg password "TestPassword123!" \
--arg token "$user_token" \
--arg group "1" \
'.users += [{"email":$email,"password":$password,"token":$token,"group":$group}]' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
fi
done
echo "=== Step 2: Create second group with 2 users ==="
# Register first user of second group
user6_response=$(register_user "user6@homebox.test" "User Six" "TestPassword123!")
user6_token=$(echo "$user6_response" | jq -r '.token // empty')
group2_token=$(echo "$user6_response" | jq -r '.group.inviteToken // empty')
if [ -z "$user6_token" ]; then
echo "Failed to register user6"
echo "Response: $user6_response"
exit 1
fi
echo "user6 registered with token. Group 2 token: $group2_token"
# Store user6 data
jq --arg email "user6@homebox.test" \
--arg password "TestPassword123!" \
--arg token "$user6_token" \
--arg group "2" \
'.users += [{"email":$email,"password":$password,"token":$token,"group":$group}]' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
# Register second user in group 2
user7_response=$(register_user "user7@homebox.test" "User Seven" "TestPassword123!" "$group2_token")
user7_token=$(echo "$user7_response" | jq -r '.token // empty')
if [ -z "$user7_token" ]; then
echo "Failed to register user7"
echo "Response: $user7_response"
else
echo "user7 registered successfully"
# Store user7 data
jq --arg email "user7@homebox.test" \
--arg password "TestPassword123!" \
--arg token "$user7_token" \
--arg group "2" \
'.users += [{"email":$email,"password":$password,"token":$token,"group":$group}]' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
fi
echo "=== Step 3: Create locations for each group ==="
# Create locations for group 1 (using user1's token)
location1=$(create_location "$user1_token" "Living Room" "Main living area")
location1_id=$(echo "$location1" | jq -r '.id // empty')
echo "Created location: Living Room (ID: $location1_id)"
location2=$(create_location "$user1_token" "Garage" "Storage and tools")
location2_id=$(echo "$location2" | jq -r '.id // empty')
echo "Created location: Garage (ID: $location2_id)"
# Create location for group 2 (using user6's token)
location3=$(create_location "$user6_token" "Home Office" "Work from home space")
location3_id=$(echo "$location3" | jq -r '.id // empty')
echo "Created location: Home Office (ID: $location3_id)"
# Store locations
jq --arg loc1 "$location1_id" \
--arg loc2 "$location2_id" \
--arg loc3 "$location3_id" \
'.locations = {"group1":[$loc1,$loc2],"group2":[$loc3]}' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
echo "=== Step 4: Create labels for each group ==="
# Create labels for group 1
label1=$(create_label "$user1_token" "Electronics" "Electronic devices")
label1_id=$(echo "$label1" | jq -r '.id // empty')
echo "Created label: Electronics (ID: $label1_id)"
label2=$(create_label "$user1_token" "Important" "High priority items")
label2_id=$(echo "$label2" | jq -r '.id // empty')
echo "Created label: Important (ID: $label2_id)"
# Create label for group 2
label3=$(create_label "$user6_token" "Work Equipment" "Items for work")
label3_id=$(echo "$label3" | jq -r '.id // empty')
echo "Created label: Work Equipment (ID: $label3_id)"
# Store labels
jq --arg lab1 "$label1_id" \
--arg lab2 "$label2_id" \
--arg lab3 "$label3_id" \
'.labels = {"group1":[$lab1,$lab2],"group2":[$lab3]}' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
echo "=== Step 5: Create test notifier ==="
# Create notifier for group 1
notifier1=$(create_notifier "$user1_token" "TESTING" "https://example.com/webhook")
notifier1_id=$(echo "$notifier1" | jq -r '.id // empty')
echo "Created notifier: TESTING (ID: $notifier1_id)"
# Store notifier
jq --arg not1 "$notifier1_id" \
'.notifiers = {"group1":[$not1]}' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
echo "=== Step 6: Create items for all users ==="
# Create items for users in group 1
# Users in the same group share data, so user1's token is reused for users 2-5
declare -A user_tokens
user_tokens[1]=$user1_token
user_tokens[2]=$user1_token
user_tokens[3]=$user1_token
user_tokens[4]=$user1_token
user_tokens[5]=$user1_token
# Items for group 1 users
echo "Creating items for group 1..."
item1=$(create_item "$user1_token" "Laptop Computer" "Dell XPS 15 for work" "$location1_id")
item1_id=$(echo "$item1" | jq -r '.id // empty')
echo "Created item: Laptop Computer (ID: $item1_id)"
item2=$(create_item "$user1_token" "Power Drill" "DeWalt 20V cordless drill" "$location2_id")
item2_id=$(echo "$item2" | jq -r '.id // empty')
echo "Created item: Power Drill (ID: $item2_id)"
item3=$(create_item "$user1_token" "TV Remote" "Samsung TV remote control" "$location1_id")
item3_id=$(echo "$item3" | jq -r '.id // empty')
echo "Created item: TV Remote (ID: $item3_id)"
item4=$(create_item "$user1_token" "Tool Box" "Red metal tool box with tools" "$location2_id")
item4_id=$(echo "$item4" | jq -r '.id // empty')
echo "Created item: Tool Box (ID: $item4_id)"
item5=$(create_item "$user1_token" "Coffee Maker" "Breville espresso machine" "$location1_id")
item5_id=$(echo "$item5" | jq -r '.id // empty')
echo "Created item: Coffee Maker (ID: $item5_id)"
# Items for group 2 users
echo "Creating items for group 2..."
item6=$(create_item "$user6_token" "Monitor" "27 inch 4K monitor" "$location3_id")
item6_id=$(echo "$item6" | jq -r '.id // empty')
echo "Created item: Monitor (ID: $item6_id)"
item7=$(create_item "$user6_token" "Keyboard" "Mechanical keyboard" "$location3_id")
item7_id=$(echo "$item7" | jq -r '.id // empty')
echo "Created item: Keyboard (ID: $item7_id)"
# Store items
jq --argjson group1_items "[\"$item1_id\",\"$item2_id\",\"$item3_id\",\"$item4_id\",\"$item5_id\"]" \
--argjson group2_items "[\"$item6_id\",\"$item7_id\"]" \
'.items = {"group1":$group1_items,"group2":$group2_items}' \
"$TEST_DATA_FILE" > "$TEST_DATA_FILE.tmp" && mv "$TEST_DATA_FILE.tmp" "$TEST_DATA_FILE"
echo "=== Step 7: Add attachments to items ==="
# Add attachments for group 1 items
echo "Adding attachments to group 1 items..."
attach_file_to_item "$user1_token" "$item1_id" "laptop-receipt.pdf"
attach_file_to_item "$user1_token" "$item1_id" "laptop-warranty.pdf"
attach_file_to_item "$user1_token" "$item2_id" "drill-manual.pdf"
attach_file_to_item "$user1_token" "$item3_id" "remote-guide.pdf"
attach_file_to_item "$user1_token" "$item4_id" "toolbox-inventory.txt"
# Add attachments for group 2 items
echo "Adding attachments to group 2 items..."
attach_file_to_item "$user6_token" "$item6_id" "monitor-receipt.pdf"
attach_file_to_item "$user6_token" "$item7_id" "keyboard-manual.pdf"
echo "=== Test Data Creation Complete ==="
echo "Test data file saved to: $TEST_DATA_FILE"
echo "Summary:"
echo " - Users created: 7 (5 in group 1, 2 in group 2)"
echo " - Locations created: 3"
echo " - Labels created: 3"
echo " - Notifiers created: 1"
echo " - Items created: 7"
echo " - Attachments created: 7"
# Display the test data file for verification
echo ""
echo "Test data:"
cat "$TEST_DATA_FILE" | jq '.'
exit 0

View File

@@ -1,177 +0,0 @@
# name: HomeBox Upgrade Test
# on:
# schedule:
# Run daily at 2 AM UTC
# - cron: '0 2 * * *'
# workflow_dispatch: # Allow manual trigger
# push:
# branches:
# - main
# paths:
# - '.github/workflows/upgrade-test.yaml'
# - '.github/scripts/upgrade-test/**'
jobs:
upgrade-test:
name: Test Upgrade Path
runs-on: ubuntu-latest
timeout-minutes: 60
permissions:
contents: read # Read repository contents
packages: read # Pull Docker images from GHCR
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: lts/*
- name: Install pnpm
uses: pnpm/action-setup@v3.0.0
with:
version: 9.12.2
- name: Install Playwright
run: |
cd frontend
pnpm install
pnpm exec playwright install --with-deps chromium
- name: Create test data directory
run: |
mkdir -p /tmp/homebox-data-old
mkdir -p /tmp/homebox-data-new
chmod -R 777 /tmp/homebox-data-old
chmod -R 777 /tmp/homebox-data-new
# Step 1: Pull and deploy latest stable version
- name: Pull latest stable HomeBox image
run: |
docker pull ghcr.io/sysadminsmedia/homebox:latest
- name: Start HomeBox (stable version)
run: |
docker run -d \
--name homebox-old \
--restart unless-stopped \
-p 7745:7745 \
-e HBOX_LOG_LEVEL=debug \
-e HBOX_OPTIONS_ALLOW_REGISTRATION=true \
-e TZ=UTC \
-v /tmp/homebox-data-old:/data \
ghcr.io/sysadminsmedia/homebox:latest
# Wait for the service to be ready
timeout 60 bash -c 'until curl -f http://localhost:7745/api/v1/status; do sleep 2; done'
echo "HomeBox stable version is ready"
# Step 2: Create test data
- name: Create test data
run: |
chmod +x .github/scripts/upgrade-test/create-test-data.sh
.github/scripts/upgrade-test/create-test-data.sh
env:
HOMEBOX_URL: http://localhost:7745
- name: Verify initial data creation
run: |
echo "Verifying test data was created..."
# Check if database file exists and has content
if [ -f /tmp/homebox-data-old/homebox.db ]; then
ls -lh /tmp/homebox-data-old/homebox.db
echo "Database file exists"
else
echo "Database file not found!"
exit 1
fi
- name: Stop old HomeBox instance
run: |
docker stop homebox-old
docker rm homebox-old
# Step 3: Build latest version from main branch
- name: Build HomeBox from main branch
run: |
docker build \
--build-arg VERSION=main \
--build-arg COMMIT=${{ github.sha }} \
--build-arg BUILD_TIME="$(date -u +"%Y-%m-%dT%H:%M:%SZ")" \
-t homebox:test \
-f Dockerfile \
.
# Step 4: Copy data and start new version
- name: Copy data to new location
run: |
cp -r /tmp/homebox-data-old/* /tmp/homebox-data-new/
chmod -R 777 /tmp/homebox-data-new
- name: Start HomeBox (new version)
run: |
docker run -d \
--name homebox-new \
--restart unless-stopped \
-p 7745:7745 \
-e HBOX_LOG_LEVEL=debug \
-e HBOX_OPTIONS_ALLOW_REGISTRATION=true \
-e TZ=UTC \
-v /tmp/homebox-data-new:/data \
homebox:test
# Wait for the service to be ready
timeout 60 bash -c 'until curl -f http://localhost:7745/api/v1/status; do sleep 2; done'
echo "HomeBox new version is ready"
# Step 5: Run verification tests with Playwright
- name: Run verification tests
run: |
cd frontend
TEST_DATA_FILE=/tmp/test-users.json \
E2E_BASE_URL=http://localhost:7745 \
pnpm exec playwright test \
-c ./test/playwright.config.ts \
--project=chromium \
test/upgrade/upgrade-verification.spec.ts
env:
HOMEBOX_URL: http://localhost:7745
- name: Upload Playwright report
uses: actions/upload-artifact@v4
if: always()
with:
name: playwright-report-upgrade-test
path: frontend/playwright-report/
retention-days: 30
- name: Upload test traces
uses: actions/upload-artifact@v4
if: always()
with:
name: playwright-traces
path: frontend/test-results/
retention-days: 7
- name: Collect logs on failure
if: failure()
run: |
echo "=== Docker logs for new version ==="
docker logs homebox-new || true
echo "=== Database content ==="
ls -la /tmp/homebox-data-new/ || true
- name: Cleanup
if: always()
run: |
docker stop homebox-new || true
docker rm homebox-new || true
docker rmi homebox:test || true

View File

@@ -8,7 +8,7 @@ github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 h1:Jamvg5psRI
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
github.com/sysadminsmedia/homebox/backend v0.0.0-20251212183312-2d1d3d927bfd h1:QULUJSgHc4rSlTjb2qYT6FIgwDWFCqEpnYqc/ltsrkk=
github.com/sysadminsmedia/homebox/backend v0.0.0-20251212183312-2d1d3d927bfd/go.mod h1:jB+tPmHtPDM1VnAjah0gvcRfP/s7c+rtQwpA8cvZD/U=
github.com/sysadminsmedia/homebox/backend v0.0.0-20251226222718-473027c1aea3 h1:O7Sy/SfxuqxaeR4kUK/sRhHPeKrmraszRyK7ROJZ7Qw=
github.com/sysadminsmedia/homebox/backend v0.0.0-20251226222718-473027c1aea3/go.mod h1:9zHHw5TNttw5Kn4Wks+SxwXmJPz6PgGNbnB4BtF1Z4c=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

View File

@@ -94,3 +94,16 @@ func (ctrl *V1Controller) HandleSetPrimaryPhotos() errchain.HandlerFunc {
func (ctrl *V1Controller) HandleCreateMissingThumbnails() errchain.HandlerFunc {
return actionHandlerFactory("create missing thumbnails", ctrl.repo.Attachments.CreateMissingThumbnails)
}
// HandleWipeInventory godoc
//
// @Summary Wipe Inventory
// @Description Deletes all items in the inventory
// @Tags Actions
// @Produce json
// @Success 200 {object} ActionAmountResult
// @Router /v1/actions/wipe-inventory [Post]
// @Security Bearer
func (ctrl *V1Controller) HandleWipeInventory() errchain.HandlerFunc {
return actionHandlerFactory("wipe inventory", ctrl.repo.Items.WipeInventory)
}
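For reference, the endpoint documented above can be exercised by any HTTP client. A hedged TypeScript sketch follows; the base URL and bearer token are placeholders, and `completed` mirrors the ActionAmountResult field the frontend consumes later in this diff:
```typescript
// Sketch of calling the new action endpoint directly. The bearer token and base
// URL are placeholders; `completed` mirrors how the Tools page reads the result.
async function wipeInventory(baseUrl: string, token: string): Promise<number> {
  const res = await fetch(`${baseUrl}/api/v1/actions/wipe-inventory`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) {
    throw new Error(`wipe-inventory failed with HTTP ${res.status}`);
  }
  const body: { completed: number } = await res.json();
  return body.completed; // number of items deleted
}
```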

View File

@@ -108,7 +108,7 @@ func run(cfg *config.Config) error {
return err
}
if strings.ToLower(cfg.Database.Driver) == config.DriverPostgres {
if strings.ToLower(cfg.Database.Driver) == "postgres" {
if !validatePostgresSSLMode(cfg.Database.SslMode) {
log.Error().Str("sslmode", cfg.Database.SslMode).Msg("invalid sslmode")
return fmt.Errorf("invalid sslmode: %s", cfg.Database.SslMode)

View File

@@ -108,6 +108,7 @@ func (a *app) mountRoutes(r *chi.Mux, chain *errchain.ErrChain, repos *repo.AllR
r.Post("/actions/ensure-import-refs", chain.ToHandlerFunc(v1Ctrl.HandleEnsureImportRefs(), userMW...))
r.Post("/actions/set-primary-photos", chain.ToHandlerFunc(v1Ctrl.HandleSetPrimaryPhotos(), userMW...))
r.Post("/actions/create-missing-thumbnails", chain.ToHandlerFunc(v1Ctrl.HandleCreateMissingThumbnails(), userMW...))
r.Post("/actions/wipe-inventory", chain.ToHandlerFunc(v1Ctrl.HandleWipeInventory(), userMW...))
r.Get("/locations", chain.ToHandlerFunc(v1Ctrl.HandleLocationGetAll(), userMW...))
r.Post("/locations", chain.ToHandlerFunc(v1Ctrl.HandleLocationCreate(), userMW...))

View File

@@ -41,7 +41,7 @@ func setupStorageDir(cfg *config.Config) error {
func setupDatabaseURL(cfg *config.Config) (string, error) {
databaseURL := ""
switch strings.ToLower(cfg.Database.Driver) {
case config.DriverSqlite3:
case "sqlite3":
databaseURL = cfg.Database.SqlitePath
dbFilePath := strings.Split(cfg.Database.SqlitePath, "?")[0]
dbDir := filepath.Dir(dbFilePath)
@@ -49,7 +49,7 @@ func setupDatabaseURL(cfg *config.Config) (string, error) {
log.Error().Err(err).Str("path", dbDir).Msg("failed to create SQLite database directory")
return "", fmt.Errorf("failed to create SQLite database directory: %w", err)
}
case config.DriverPostgres:
case "postgres":
databaseURL = fmt.Sprintf("host=%s port=%s dbname=%s sslmode=%s", cfg.Database.Host, cfg.Database.Port, cfg.Database.Database, cfg.Database.SslMode)
if cfg.Database.Username != "" {
databaseURL += fmt.Sprintf(" user=%s", cfg.Database.Username)

View File

@@ -4,7 +4,6 @@ import (
"entgo.io/ent/dialect/sql"
"github.com/sysadminsmedia/homebox/backend/internal/data/ent/item"
"github.com/sysadminsmedia/homebox/backend/internal/data/ent/predicate"
conf "github.com/sysadminsmedia/homebox/backend/internal/sys/config"
"github.com/sysadminsmedia/homebox/backend/pkgs/textutils"
)
@@ -25,7 +24,7 @@ func AccentInsensitiveContains(field string, searchValue string) predicate.Item
dialect := s.Dialect()
switch dialect {
case conf.DriverSqlite3:
case "sqlite3":
// For SQLite, we'll create a custom normalization function using REPLACE
// to handle common accented characters
normalizeFunc := buildSQLiteNormalizeExpression(s.C(field))
@@ -33,7 +32,7 @@ func AccentInsensitiveContains(field string, searchValue string) predicate.Item
"LOWER("+normalizeFunc+") LIKE ?",
"%"+normalizedSearch+"%",
))
case conf.DriverPostgres:
case "postgres":
// For PostgreSQL, use REPLACE-based normalization to avoid unaccent dependency
normalizeFunc := buildGenericNormalizeExpression(s.C(field))
// Use sql.P() for proper PostgreSQL parameter binding ($1, $2, etc.)

View File

@@ -6,7 +6,6 @@ import (
"fmt"
"github.com/rs/zerolog/log"
"github.com/sysadminsmedia/homebox/backend/internal/sys/config"
)
//go:embed all:postgres
@@ -22,9 +21,9 @@ var sqliteFiles embed.FS
// embedded file system containing the migration files for the specified dialect.
func Migrations(dialect string) (embed.FS, error) {
switch dialect {
case config.DriverPostgres:
case "postgres":
return postgresFiles, nil
case config.DriverSqlite3:
case "sqlite3":
return sqliteFiles, nil
default:
log.Error().Str("dialect", dialect).Msg("unknown sql dialect")

View File

@@ -809,6 +809,51 @@ func (e *ItemsRepository) DeleteByGroup(ctx context.Context, gid, id uuid.UUID)
return err
}
func (e *ItemsRepository) WipeInventory(ctx context.Context, gid uuid.UUID) (int, error) {
// Get all items for the group
items, err := e.db.Item.Query().
Where(item.HasGroupWith(group.ID(gid))).
WithAttachments().
All(ctx)
if err != nil {
return 0, err
}
deleted := 0
// Delete each item with its attachments
// Note: We manually delete attachments and items instead of calling DeleteByGroup
// to continue processing remaining items even if some deletions fail
for _, itm := range items {
// Delete all attachments first
for _, att := range itm.Edges.Attachments {
err := e.attachments.Delete(ctx, gid, itm.ID, att.ID)
if err != nil {
log.Err(err).Str("attachment_id", att.ID.String()).Msg("failed to delete attachment during wipe inventory")
// Continue with other attachments even if one fails
}
}
// Delete the item
_, err = e.db.Item.
Delete().
Where(
item.ID(itm.ID),
item.HasGroupWith(group.ID(gid)),
).Exec(ctx)
if err != nil {
log.Err(err).Str("item_id", itm.ID.String()).Msg("failed to delete item during wipe inventory")
// Skip to next item without incrementing counter
continue
}
// Only increment counter if deletion succeeded
deleted++
}
e.publishMutationEvent(gid)
return deleted, nil
}
func (e *ItemsRepository) UpdateByGroup(ctx context.Context, gid uuid.UUID, data ItemUpdate) (ItemOut, error) {
q := e.db.Item.Update().Where(item.ID(data.ID), item.HasGroupWith(group.ID(gid))).
SetName(data.Name).

View File

@@ -1,8 +1,7 @@
package config
const (
DriverSqlite3 = "sqlite3"
DriverPostgres = "postgres"
DriverSqlite3 = "sqlite3"
)
type Storage struct {

View File

@@ -43,7 +43,6 @@ export default defineConfig({
nav: [
{ text: 'API Docs', link: '/en/api' },
{ text: 'Demo', link: 'https://demo.homebox.software' },
{ text: 'Blog', link: 'https://sysadminsjournal.com/tag/homebox/' }
],
sidebar: {

View File

@@ -31,4 +31,10 @@ export class ActionsAPI extends BaseAPI {
url: route("/actions/create-missing-thumbnails"),
});
}
wipeInventory() {
return this.http.post<void, ActionAmountResult>({
url: route("/actions/wipe-inventory"),
});
}
}

View File

@@ -735,6 +735,10 @@
"set_primary_photo_button": "Set Primary Photo",
"set_primary_photo_confirm": "Are you sure you want to set primary photos? This can take a while and cannot be undone.",
"set_primary_photo_sub": "In version v0.10.0 of Homebox, the primary image field was added to attachments of type photo. This action will set the primary image field to the first image in the attachments array in the database, if it is not already set. '<a class=\"link\" href=\"https://github.com/hay-kot/homebox/pull/576\">'See GitHub PR #576'</a>'",
"wipe_inventory": "Wipe Inventory",
"wipe_inventory_button": "Wipe Inventory",
"wipe_inventory_confirm": "Are you sure you want to wipe your entire inventory? This will delete all items and cannot be undone.",
"wipe_inventory_sub": "Permanently deletes all items in your inventory. This action is irreversible and will remove all item data including attachments and photos.",
"zero_datetimes": "Zero Item Date Times",
"zero_datetimes_button": "Zero Item Date Times",
"zero_datetimes_confirm": "Are you sure you want to reset all date and time values? This can take a while and cannot be undone.",
@@ -768,7 +772,9 @@
"failed_ensure_ids": "Failed to ensure asset IDs.",
"failed_ensure_import_refs": "Failed to ensure import refs.",
"failed_set_primary_photos": "Failed to set primary photos.",
"failed_zero_datetimes": "Failed to reset date and time values."
"failed_wipe_inventory": "Failed to wipe inventory.",
"failed_zero_datetimes": "Failed to reset date and time values.",
"wipe_inventory_success": "Successfully wiped inventory. { results } items deleted."
}
}
}

View File

@@ -90,6 +90,12 @@
<div v-html="DOMPurify.sanitize($t('tools.actions_set.create_missing_thumbnails_sub'))" />
<template #button> {{ $t("tools.actions_set.create_missing_thumbnails_button") }} </template>
</DetailAction>
<DetailAction @action="wipeInventory">
<template #title> {{ $t("tools.actions_set.wipe_inventory") }} </template>
<!-- eslint-disable-next-line vue/no-v-html -->
<div v-html="DOMPurify.sanitize($t('tools.actions_set.wipe_inventory_sub'))" />
<template #button> {{ $t("tools.actions_set.wipe_inventory_button") }} </template>
</DetailAction>
</div>
</BaseCard>
</BaseContainer>
@@ -220,6 +226,23 @@
toast.success(t("tools.toast.asset_success", { results: result.data.completed }));
}
async function wipeInventory() {
const { isCanceled } = await confirm.open(t("tools.actions_set.wipe_inventory_confirm"));
if (isCanceled) {
return;
}
const result = await api.actions.wipeInventory();
if (result.error) {
toast.error(t("tools.toast.failed_wipe_inventory"));
return;
}
toast.success(t("tools.toast.wipe_inventory_success", { results: result.data.completed }));
}
</script>
<style scoped></style>

View File

@@ -1,418 +0,0 @@
/**
* HomeBox Upgrade Verification Tests
*
* NOTE: These tests are ONLY meant to run in the upgrade-test workflow.
* They require test data to be pre-created by the create-test-data.sh script.
* These tests are stored in test/upgrade/ (not test/e2e/) to prevent them
* from running during normal E2E test runs.
*/
import { expect, test } from "@playwright/test";
import * as fs from "fs";
// Load test data created by the setup script
const testDataPath = process.env.TEST_DATA_FILE || "/tmp/test-users.json";
interface TestUser {
email: string;
password: string;
token: string;
group: string;
}
interface TestData {
users?: TestUser[];
locations?: Record<string, string[]>;
labels?: Record<string, string[]>;
items?: Record<string, string[]>;
notifiers?: Record<string, string[]>;
}
let testData: TestData = {};
test.beforeAll(() => {
if (fs.existsSync(testDataPath)) {
const rawData = fs.readFileSync(testDataPath, "utf-8");
testData = JSON.parse(rawData);
console.log("Loaded test data:", JSON.stringify(testData, null, 2));
} else {
console.error(`Test data file not found at ${testDataPath}`);
throw new Error("Test data file not found");
}
});
test.describe("HomeBox Upgrade Verification", () => {
test("verify all users can log in", async ({ page }) => {
// Test each user from the test data
for (const user of testData.users || []) {
await page.goto("/");
await expect(page).toHaveURL("/");
// Wait for login form to be ready
await page.waitForSelector("input[type='text']", { state: "visible" });
// Fill in login form
await page.fill("input[type='text']", user.email);
await page.fill("input[type='password']", user.password);
await page.click("button[type='submit']");
// Wait for navigation to home page
await expect(page).toHaveURL("/home", { timeout: 10000 });
console.log(`✓ User ${user.email} logged in successfully`);
// Navigate back to login for next user
await page.goto("/");
await page.waitForSelector("input[type='text']", { state: "visible" });
}
});
test("verify application version is displayed", async ({ page }) => {
// Login as first user
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
// Look for version in footer or about section
// The version might be in the footer or a settings page
// Check if footer exists and contains version info
const footer = page.locator("footer");
if ((await footer.count()) > 0) {
const footerText = await footer.textContent();
console.log("Footer text:", footerText);
// Version should be present in some form
// This is a basic check - the version format may vary
expect(footerText).toBeTruthy();
}
console.log("✓ Application version check complete");
});
test("verify locations are present", async ({ page }) => {
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
// Wait for page to load
await page.waitForSelector("body", { state: "visible" });
// Try to find locations link in navigation
const locationsLink = page.locator("a[href*='location'], button:has-text('Locations')").first();
if ((await locationsLink.count()) > 0) {
await locationsLink.click();
await page.waitForLoadState("networkidle");
// Check if locations are displayed
// The exact structure depends on the UI, but we should see location names
const pageContent = await page.textContent("body");
// Verify some of our test locations exist
expect(pageContent).toContain("Living Room");
console.log("✓ Locations verified");
} else {
console.log("! Could not find locations navigation - skipping detailed check");
}
});
test("verify labels are present", async ({ page }) => {
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
await page.waitForSelector("body", { state: "visible" });
// Try to find labels link in navigation
const labelsLink = page.locator("a[href*='label'], button:has-text('Labels')").first();
if ((await labelsLink.count()) > 0) {
await labelsLink.click();
await page.waitForLoadState("networkidle");
const pageContent = await page.textContent("body");
// Verify some of our test labels exist
expect(pageContent).toContain("Electronics");
console.log("✓ Labels verified");
} else {
console.log("! Could not find labels navigation - skipping detailed check");
}
});
test("verify items are present", async ({ page }) => {
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
await page.waitForSelector("body", { state: "visible" });
// Navigate to items list
// This might be the home page or a separate items page
const itemsLink = page.locator("a[href*='item'], button:has-text('Items')").first();
if ((await itemsLink.count()) > 0) {
await itemsLink.click();
await page.waitForLoadState("networkidle");
}
const pageContent = await page.textContent("body");
// Verify some of our test items exist
expect(pageContent).toContain("Laptop Computer");
console.log("✓ Items verified");
});
test("verify notifier is present", async ({ page }) => {
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
await page.waitForSelector("body", { state: "visible" });
// Navigate to settings or profile
// Notifiers are typically in settings
const settingsLink = page.locator("a[href*='setting'], a[href*='profile'], button:has-text('Settings')").first();
if ((await settingsLink.count()) > 0) {
await settingsLink.click();
await page.waitForLoadState("networkidle");
// Look for notifiers section
const notifiersLink = page.locator("a:has-text('Notif'), button:has-text('Notif')").first();
if ((await notifiersLink.count()) > 0) {
await notifiersLink.click();
await page.waitForLoadState("networkidle");
const pageContent = await page.textContent("body");
// Verify our test notifier exists
expect(pageContent).toContain("TESTING");
console.log("✓ Notifier verified");
} else {
console.log("! Could not find notifiers section - skipping detailed check");
}
} else {
console.log("! Could not find settings navigation - skipping notifier check");
}
});
test("verify attachments are present for items", async ({ page }) => {
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
await page.waitForSelector("body", { state: "visible" });
// Search for "Laptop Computer" which should have attachments
const searchInput = page.locator("input[type='search'], input[placeholder*='Search']").first();
if ((await searchInput.count()) > 0) {
await searchInput.fill("Laptop Computer");
await page.waitForLoadState("networkidle");
// Click on the laptop item
const laptopItem = page.locator("text=Laptop Computer").first();
await laptopItem.click();
await page.waitForLoadState("networkidle");
// Look for attachments section
const pageContent = await page.textContent("body");
// Check for attachment indicators (could be files, documents, attachments, etc.)
const hasAttachments =
pageContent?.includes("laptop-receipt") ||
pageContent?.includes("laptop-warranty") ||
pageContent?.includes("attachment") ||
pageContent?.includes("Attachment") ||
pageContent?.includes("document");
expect(hasAttachments).toBeTruthy();
console.log("✓ Attachments verified");
} else {
console.log("! Could not find search - trying direct navigation");
// Try alternative: look for items link and browse
const itemsLink = page.locator("a[href*='item'], button:has-text('Items')").first();
if ((await itemsLink.count()) > 0) {
await itemsLink.click();
await page.waitForLoadState("networkidle");
const laptopLink = page.locator("text=Laptop Computer").first();
if ((await laptopLink.count()) > 0) {
await laptopLink.click();
await page.waitForLoadState("networkidle");
const pageContent = await page.textContent("body");
const hasAttachments =
pageContent?.includes("laptop-receipt") ||
pageContent?.includes("laptop-warranty") ||
pageContent?.includes("attachment");
expect(hasAttachments).toBeTruthy();
console.log("✓ Attachments verified via direct navigation");
}
}
}
});
test("verify theme can be adjusted", async ({ page }) => {
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
await page.waitForSelector("body", { state: "visible" });
// Look for theme toggle (usually a sun/moon icon or settings)
// Common selectors for theme toggles
const themeToggle = page
.locator(
"button[aria-label*='theme'], button[aria-label*='Theme'], " +
"button:has-text('Dark'), button:has-text('Light'), " +
"[data-theme-toggle], .theme-toggle"
)
.first();
if ((await themeToggle.count()) > 0) {
// Get initial theme state (could be from class, attribute, or computed style)
const bodyBefore = page.locator("body");
const classNameBefore = (await bodyBefore.getAttribute("class")) || "";
// Click theme toggle
await themeToggle.click();
// Wait for theme change to complete
await page.waitForTimeout(500);
// Get theme state after toggle
const classNameAfter = (await bodyBefore.getAttribute("class")) || "";
// Verify that something changed
expect(classNameBefore).not.toBe(classNameAfter);
console.log(`✓ Theme toggle working (${classNameBefore} -> ${classNameAfter})`);
} else {
// Try to find theme in settings
const settingsLink = page.locator("a[href*='setting'], a[href*='profile']").first();
if ((await settingsLink.count()) > 0) {
await settingsLink.click();
await page.waitForLoadState("networkidle");
const themeOption = page.locator("select[name*='theme'], button:has-text('Theme')").first();
if ((await themeOption.count()) > 0) {
console.log("✓ Theme settings found");
} else {
console.log("! Could not find theme toggle - feature may not be easily accessible");
}
} else {
console.log("! Could not find theme controls");
}
}
});
test("verify data counts match expectations", async ({ page }) => {
const firstUser = testData.users?.[0];
if (!firstUser) {
throw new Error("No users found in test data");
}
await page.goto("/");
await page.fill("input[type='text']", firstUser.email);
await page.fill("input[type='password']", firstUser.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
await page.waitForSelector("body", { state: "visible" });
// Check that we have the expected number of items for group 1 (5 items)
const pageContent = await page.textContent("body");
// Look for item count indicators
// This is dependent on the UI showing counts
console.log("✓ Logged in and able to view dashboard");
// Verify at least that the page loaded and shows some content
expect(pageContent).toBeTruthy();
if (pageContent) {
expect(pageContent.length).toBeGreaterThan(100);
}
});
test("verify second group users and data isolation", async ({ page }) => {
// Login as user from group 2
const group2User = testData.users?.find(u => u.group === "2");
if (!group2User) {
console.log("! No group 2 users found - skipping isolation test");
return;
}
await page.goto("/");
await page.fill("input[type='text']", group2User.email);
await page.fill("input[type='password']", group2User.password);
await page.click("button[type='submit']");
await expect(page).toHaveURL("/home", { timeout: 10000 });
await page.waitForSelector("body", { state: "visible" });
const pageContent = await page.textContent("body");
// Verify group 2 can see their items
expect(pageContent).toContain("Monitor");
// Verify group 2 cannot see group 1 items
expect(pageContent).not.toContain("Laptop Computer");
console.log("✓ Data isolation verified between groups");
});
});