Device Testing Farm Orchestrator
Enterprise device testing farm management that supports real devices, emulators, simulators, and cloud testing services, with automated test execution and comprehensive reporting.
Overview
The Device Testing Farm Orchestrator provides comprehensive device farm management for testing across multiple platforms, devices, and configurations. It supports both local device farms and integration with cloud testing services.
Supported Device Types
📱 Mobile Devices
- Real Android Devices - Physical phones and tablets
- Android Emulators - Virtual Android devices
- Real iOS Devices - Physical iPhones and iPads
- iOS Simulators - Xcode simulator instances
🌐 Web Testing
- Desktop Browsers - Chrome, Firefox, Safari, Edge
- Mobile Browsers - Mobile Chrome, Safari Mobile
- Headless Testing - Puppeteer, Playwright support
☁️ Cloud Integration
- AWS Device Farm - Amazon's cloud testing service
- Firebase Test Lab - Google's device testing platform
- BrowserStack - Cross-browser cloud testing
- Sauce Labs - Continuous testing cloud
API Endpoints
List Available Devices
Response:
{
  "devices": [
    {
      "id": "android_pixel6",
      "name": "Google Pixel 6",
      "platform": "android",
      "device_type": "real_device",
      "status": "available",
      "capabilities": {
        "os_version": "13",
        "device_model": "Pixel 6",
        "screen_resolution": [1080, 2400],
        "ram_mb": 8192,
        "form_factor": "phone"
      },
      "health_score": 95.5,
      "usage_count": 127
    }
  ]
}
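A minimal client-side sketch of working with this response, assuming the device list has already been fetched as JSON (only the fields shown above are used):
import json

def pick_best_device(response_text: str, platform: str = "android"):
    """Pick the healthiest available device for a platform from a device-list response."""
    devices = json.loads(response_text)["devices"]
    candidates = [
        d for d in devices
        if d["platform"] == platform and d["status"] == "available"
    ]
    # Prefer the highest health score, breaking ties by lowest usage count
    candidates.sort(key=lambda d: (-d["health_score"], d["usage_count"]))
    return candidates[0] if candidates else None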
Create Test Session
POST /test-sessions
Content-Type: application/json

{
  "test_suite": "ecommerce-app-tests",
  "framework": "appium",
  "test_config": {
    "platform": "android",
    "app_path": "/path/to/app.apk",
    "test_timeout": 1800
  },
  "device_requirements": {
    "min_os_version": "10",
    "device_type": "real_device",
    "form_factor": "phone"
  }
}
Response:
{
  "message": "Test session created",
  "session": {
    "id": "session_789",
    "test_suite": "ecommerce-app-tests",
    "status": "queued",
    "queue_position": 3,
    "estimated_wait_time": "5 minutes"
  }
}
Get Test Results
Response:
{
  "id": "session_789",
  "device_id": "android_pixel6",
  "test_suite": "ecommerce-app-tests",
  "status": "completed",
  "start_time": "2025-01-15T10:30:00Z",
  "end_time": "2025-01-15T10:45:00Z",
  "duration": 900,
  "results": {
    "tests_passed": 45,
    "tests_failed": 3,
    "tests_skipped": 1,
    "success_rate": 93.75
  },
  "artifacts": [
    "session_789_screenshots.zip",
    "session_789_logs.txt",
    "session_789_video.mp4"
  ]
}
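A note on the numbers above (an inference from the sample payload, not a documented formula): success_rate appears to count only executed tests, so skipped tests are excluded from the denominator.
def success_rate(passed: int, failed: int) -> float:
    """Success rate over executed tests only (skipped tests excluded)."""
    executed = passed + failed
    return round(100.0 * passed / executed, 2) if executed else 0.0

# Matches the sample payload: 45 passed, 3 failed -> 93.75
assert success_rate(45, 3) == 93.75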
Get Queue Status
Response:
{
  "queue_length": 5,
  "queue": [
    {
      "id": "session_790",
      "test_suite": "mobile-banking-tests",
      "queue_position": 1,
      "estimated_wait_time": 2
    }
  ]
}
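A small sketch of surfacing this status to developers, assuming the payload has already been retrieved and that estimated_wait_time is expressed in minutes (an assumption; the create-session response uses a "5 minutes" string):
def summarize_queue(queue_payload: dict) -> str:
    """Render a one-line summary per queued session from a queue-status payload."""
    lines = [f"{queue_payload['queue_length']} session(s) queued"]
    for entry in queue_payload["queue"]:
        lines.append(
            f"  #{entry['queue_position']} {entry['id']} "
            f"({entry['test_suite']}), ~{entry['estimated_wait_time']} min wait"
        )
    return "\n".join(lines)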
Usage Examples
Basic Test Execution
# Run tests on available Android device
unacode device-farm run-tests \
--test-suite "regression-tests" \
--framework "appium" \
--platform "android" \
--app-path "./app.apk"
# Run parallel tests across multiple devices
unacode device-farm run-parallel \
--test-suite "smoke-tests" \
--platforms "android,ios" \
--device-count 4
Python Integration
import asyncio

# TestFramework is assumed to be exported alongside the orchestrator
from unacode.orchestrators import DeviceTestingFarmOrchestrator, TestFramework

async def run_comprehensive_testing():
    farm = DeviceTestingFarmOrchestrator()

    # Create test session for mobile app
    session = await farm.create_test_session(
        test_suite="ecommerce-mobile-tests",
        framework=TestFramework.APPIUM,
        test_config={
            "app_path": "/builds/ecommerce-v1.2.apk",
            "test_timeout": 1800,
            "screenshot_on_failure": True,
            "video_recording": True
        },
        device_requirements={
            "platform": "android",
            "min_os_version": "10",
            "device_type": "real_device"
        }
    )

    print(f"Test session created: {session.id}")
    print(f"Queue position: {session.queue_position}")

    # Wait for completion
    while session.status in ["queued", "running"]:
        await asyncio.sleep(10)
        # Refresh session status
        session = await farm.get_test_session(session.id)

    return session
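The coroutine can then be driven from a script entry point with asyncio.run:
if __name__ == "__main__":
    completed = asyncio.run(run_comprehensive_testing())
    print(f"Session {completed.id} finished with status: {completed.status}")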
Cross-Platform Testing
async def cross_platform_test_suite():
    farm = DeviceTestingFarmOrchestrator()

    test_platforms = [
        {"platform": "android", "versions": ["11", "12", "13"]},
        {"platform": "ios", "versions": ["15", "16", "17"]},
        {"platform": "web", "browsers": ["chrome", "firefox", "safari"]}
    ]

    sessions = []
    for platform_config in test_platforms:
        if platform_config["platform"] == "web":
            for browser in platform_config["browsers"]:
                session = await farm.create_test_session(
                    test_suite="web-compatibility-tests",
                    framework=TestFramework.SELENIUM,
                    test_config={
                        "browser": browser,
                        "viewport": {"width": 1920, "height": 1080}
                    }
                )
                sessions.append(session)
        else:
            for version in platform_config["versions"]:
                session = await farm.create_test_session(
                    test_suite="mobile-compatibility-tests",
                    framework=TestFramework.APPIUM,
                    test_config={
                        "platform": platform_config["platform"],
                        "os_version": version
                    }
                )
                sessions.append(session)

    # Wait for all sessions to complete
    # (wait_for_session_completion and generate_cross_platform_report are
    # user-defined helpers; a sketch of the former follows below)
    completed_sessions = await asyncio.gather(*[
        wait_for_session_completion(session) for session in sessions
    ])

    return generate_cross_platform_report(completed_sessions)
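The example above relies on a wait_for_session_completion helper that is not part of the orchestrator itself. A minimal sketch, assuming the get_test_session method and the status values shown earlier, might look like this:
async def wait_for_session_completion(session, farm=None, poll_interval: float = 10.0):
    """Poll a test session until it leaves the queued/running states."""
    # In practice you would share the orchestrator instance used to create the session
    farm = farm or DeviceTestingFarmOrchestrator()
    while session.status in ("queued", "running"):
        await asyncio.sleep(poll_interval)
        session = await farm.get_test_session(session.id)
    return session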
Framework Support
Appium (Mobile Testing)
# Android configuration
android_config = {
    "framework": "appium",
    "platform": "android",
    "automation_name": "UiAutomator2",
    "app": "/path/to/app.apk",
    "device_name": "Samsung Galaxy S23",
    "capabilities": {
        "appPackage": "com.company.app",
        "appActivity": ".MainActivity",
        "noReset": True
    }
}

# iOS configuration
ios_config = {
    "framework": "appium",
    "platform": "ios",
    "automation_name": "XCUITest",
    "app": "/path/to/app.ipa",
    "device_name": "iPhone 14 Pro",
    "capabilities": {
        "bundleId": "com.company.app",
        "udid": "device-uuid",
        "xcodeOrgId": "team-id"
    }
}
Selenium (Web Testing)
# Chrome configuration
chrome_config = {
    "framework": "selenium",
    "browser": "chrome",
    "capabilities": {
        "browserVersion": "latest",
        "platformName": "Windows 10",
        "screenResolution": "1920x1080"
    },
    "options": {
        "headless": False,
        "disable-gpu": True,
        "no-sandbox": True
    }
}

# Mobile web testing
mobile_web_config = {
    "framework": "selenium",
    "browser": "chrome",
    "mobile_emulation": {
        "deviceName": "iPhone 12 Pro",
        "userAgent": "mobile-chrome-ios"
    }
}
Playwright (Modern Web Testing)
# Multi-browser configuration
playwright_config = {
    "framework": "playwright",
    "browsers": ["chromium", "firefox", "webkit"],
    "viewport": {"width": 1280, "height": 720},
    "options": {
        "headless": True,
        "screenshot": "only-on-failure",
        "video": "retain-on-failure"
    }
}
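As a sketch of how these dictionaries might feed into the orchestrator, the Selenium configuration above could be passed through create_test_session as the test_config, reusing the signature from the Python integration examples; the exact key mapping is an assumption:
async def run_chrome_compat():
    farm = DeviceTestingFarmOrchestrator()
    # The "framework" key is dropped because the framework is supplied
    # explicitly; whether the orchestrator expects this split is an assumption.
    session = await farm.create_test_session(
        test_suite="web-compatibility-tests",
        framework=TestFramework.SELENIUM,
        test_config={k: v for k, v in chrome_config.items() if k != "framework"}
    )
    return session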
Device Management
Health Monitoring
Response:
{
  "device_id": "android_pixel6",
  "timestamp": "2025-01-15T10:30:00Z",
  "health_score": 95.5,
  "metrics": {
    "cpu_usage": 12.5,
    "memory_usage": 45.2,
    "disk_usage": 67.8,
    "battery_level": 85.0,
    "network_latency": 23.4,
    "temperature": 35.2
  },
  "issues": [],
  "recommendations": [
    "Consider rebooting device after 50 more test sessions",
    "Monitor temperature during intensive testing"
  ]
}
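A small sketch of acting on such a health report, for example taking a device out of rotation when its score drops too low; the 80-point threshold and the quarantine step are illustrative assumptions, not product behavior:
UNHEALTHY_THRESHOLD = 80.0  # illustrative cut-off, not a documented default

def triage_device(health_report: dict) -> str:
    """Decide what to do with a device based on a health-report payload."""
    score = health_report["health_score"]
    if health_report["issues"] or score < UNHEALTHY_THRESHOLD:
        return f"quarantine {health_report['device_id']} (score {score})"
    return f"keep {health_report['device_id']} in rotation (score {score})"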
Device Provisioning
# Add new Android device
unacode device-farm add-device \
--device-id "samsung_s24" \
--platform "android" \
--device-name "Samsung Galaxy S24"
# Configure iOS simulator
unacode device-farm create-simulator \
--platform "ios" \
--os-version "17.0" \
--device-type "iPhone 15 Pro"
# Setup browser instances
unacode device-farm setup-browsers \
--browsers "chrome,firefox,edge" \
--instances-per-browser 3
Cloud Service Integration
AWS Device Farm
# Configure AWS Device Farm
aws_config = {
    "provider": "aws_device_farm",
    "region": "us-west-2",
    "project_arn": "arn:aws:devicefarm:us-west-2:123456789012:project:uuid",
    "credentials": {
        "access_key_id": "AKIA...",
        "secret_access_key": "...",
        "session_token": "..."
    }
}

# Run tests on AWS devices (inside an async context)
aws_session = await farm.create_cloud_test_session(
    provider="aws_device_farm",
    test_config={
        "app_upload": "/path/to/app.apk",
        "test_spec": "standard-appium-python",
        "device_pool": "android-devices-pool"
    }
)
Firebase Test Lab
# Configure Firebase Test Lab
firebase_config = {
    "provider": "firebase_test_lab",
    "project_id": "my-firebase-project",
    "service_account": "/path/to/service-account.json"
}

# Run matrix testing
matrix_config = {
    "app": "/path/to/app.apk",
    "test": "/path/to/test.apk",
    "devices": [
        {"model": "Pixel6", "version": "33", "locale": "en_US"},
        {"model": "galaxy-s23", "version": "33", "locale": "en_US"}
    ]
}
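By analogy with the AWS example above, running this matrix might look like the following; the provider string reuses the value from firebase_config, but the exact parameters create_cloud_test_session accepts for Firebase are an assumption:
# Run the device matrix on Firebase Test Lab (inside an async context)
firebase_session = await farm.create_cloud_test_session(
    provider="firebase_test_lab",
    test_config=matrix_config
)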
Performance Testing
Load Testing
async def performance_load_test():
    farm = DeviceTestingFarmOrchestrator()

    # Run concurrent sessions
    concurrent_sessions = 10
    session_tasks = []

    for i in range(concurrent_sessions):
        task = asyncio.create_task(
            farm.create_test_session(
                test_suite=f"load-test-{i}",
                framework=TestFramework.APPIUM,
                test_config={
                    "test_duration": 300,  # 5 minutes
                    "performance_monitoring": True,
                    "memory_profiling": True
                }
            )
        )
        session_tasks.append(task)

    # Wait for all sessions
    sessions = await asyncio.gather(*session_tasks)

    # Analyze performance metrics (analyze_performance_results is a user-defined helper)
    return analyze_performance_results(sessions)
Resource Monitoring
# Monitor device resources during testing
resource_config = {
    "cpu_monitoring": True,
    "memory_tracking": True,
    "battery_analysis": True,
    "network_usage": True,
    "temperature_alerts": True,
    "performance_thresholds": {
        "max_cpu": 90.0,
        "max_memory": 85.0,
        "max_temperature": 45.0
    }
}
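One way such a monitoring configuration might be attached to a session is by merging it into the test_config passed at creation time, mirroring the performance_monitoring flag used in the load test above; the "resource_monitoring" key is illustrative, not a documented field:
async def run_monitored_session(farm, resource_config: dict):
    # Merge monitoring options into the session's test configuration
    return await farm.create_test_session(
        test_suite="regression-tests",
        framework=TestFramework.APPIUM,
        test_config={
            "app_path": "/builds/app.apk",
            "resource_monitoring": resource_config
        }
    )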
CI/CD Integration
GitHub Actions
# .github/workflows/device-testing.yml
name: Device Farm Testing

on: [push, pull_request]

jobs:
  device-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        platform: [android, ios]
        device-type: [real_device, simulator]
    steps:
      - uses: actions/checkout@v3

      - name: Build App
        run: |
          # Build app for testing
          ./build-app.sh ${{ matrix.platform }}

      - name: Run Device Farm Tests
        run: |
          unacode device-farm run-tests \
            --test-suite "ci-regression-tests" \
            --platform "${{ matrix.platform }}" \
            --device-type "${{ matrix.device-type }}" \
            --app-path "./builds/app-${{ matrix.platform }}.apk" \
            --wait-for-completion \
            --generate-report

      - name: Upload Test Results
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: test-results-${{ matrix.platform }}-${{ matrix.device-type }}
          path: test-results/
Dashboard Features
The integrated dashboard provides:
- Device Farm Status - Real-time device availability and health
- Test Queue Management - Monitor and prioritize test execution
- Performance Analytics - Device utilization and performance metrics
- Test Results Visualization - Comprehensive test reporting and trends
- Resource Monitoring - Track device resources and optimization opportunities
Server Configuration
# Start the device farm server
python -m unacode.master_orchestrators.device_testing_farm_orchestrator \
--host 0.0.0.0 \
--port 8082 \
--config device_farm_config.json
# Access the dashboard
open http://localhost:8082/dashboard
Best Practices
1. Device Pool Management
- Maintain a diverse device pool covering popular device models
- Schedule regular health checks and maintenance windows
- Monitor usage patterns and optimize allocation
2. Test Optimization
- Implement smart test scheduling based on device availability
- Use parallel execution to maximize throughput
- Prioritize critical tests during peak hours
3. Resource Efficiency
- Monitor device utilization and optimize test distribution
- Implement automatic cleanup and recovery procedures
- Track cost per test session for budget optimization
4. Quality Assurance
- Establish baseline performance metrics
- Implement automated failure analysis and reporting
- Regular device calibration and performance validation
Ready to scale your testing with a comprehensive device farm? Start with the Quick Start Guide or explore the API Documentation for integration details.