18 changed files with 4630 additions and 29 deletions
@@ -0,0 +1,392 @@
# Plutus Payment System - Logging Best Practices

## Overview

This document outlines the enhanced logging system implemented in the Plutus Payment Processing System. The logging infrastructure provides comprehensive monitoring, security event tracking, performance analysis, and automated log management.

## Logging Architecture

### Core Components

1. **Enhanced Logging Configuration** (`logging_config.py`)
   - Structured logging with correlation IDs
   - Multiple specialized logger types
   - Automatic log formatting and rotation

2. **Middleware System** (`middleware.py`)
   - Request/response logging
   - Performance monitoring
   - Security event detection
   - Database query tracking

3. **Analytics Dashboard** (`blueprints/analytics.py`)
   - Real-time system health monitoring
   - Performance metrics visualization
   - Security event analysis
   - Log search and filtering

4. **Log Retention System** (`log_retention.py`)
   - Automated cleanup and archiving
   - Configurable retention policies
   - Disk space management

## Logger Types

### StructuredLogger
General-purpose logger with correlation ID support and structured data.

```python
from logging_config import get_logger

logger = get_logger('module_name')
logger.info("Payment processed successfully",
            payment_id=12345,
            amount=89.95,
            customer_id="cus_123")
```

### SecurityLogger
Specialized logger for security events and threats.

```python
from logging_config import security_logger

security_logger.log_login_attempt("username", success=False, ip_address="192.168.1.1")
security_logger.log_payment_fraud_alert(payment_id=123, customer_id="cus_456",
                                        reason="Unusual amount pattern", amount=5000.0)
```

### PerformanceLogger
Dedicated logger for performance monitoring and optimization.

```python
from logging_config import performance_logger

performance_logger.log_request_time("POST /payments", "POST", 1250.5, 200, user_id=1)
performance_logger.log_stripe_api_call("create_payment", 850.2, True)
```

## Log File Structure

### File Organization
```
logs/
├── plutus_detailed.log      # Comprehensive application logs
├── performance.log          # Performance metrics and slow operations
├── security.log             # Security events and threats
├── payment_processing.log   # Payment-specific operations
├── archive/                 # Archived logs by month
│   ├── 202409/
│   └── 202410/
└── *.log.gz                 # Compressed rotated logs
```

### Log Formats

#### Standard Format
```
2024-09-02 14:30:15,123 - [corr-abc123] - plutus.payments - INFO - Payment processed successfully {"payment_id": 12345, "amount": 89.95}
```

#### Security Format
```
2024-09-02 14:30:15,123 - SECURITY - [corr-abc123] - WARNING - LOGIN_FAILED for user: testuser {"ip_address": "192.168.1.1", "user_agent": "Mozilla/5.0..."}
```

#### Performance Format
```
2024-09-02 14:30:15,123 - PERF - [corr-abc123] - REQUEST: POST /payments - 1250.50ms - 200 {"user_id": 1, "endpoint": "/payments"}
```

## Correlation IDs

### Purpose
Correlation IDs track requests across the entire system, making it easy to trace a single operation through multiple components.

### Usage
```python
from logging_config import log_context, set_correlation_id

# Automatic correlation ID
with log_context():
    logger.info("Processing payment")  # Will include an auto-generated correlation ID

# Custom correlation ID
with log_context("req-12345"):
    logger.info("Processing payment")  # Will include "req-12345"

# Manual setting
correlation_id = set_correlation_id("custom-id")
logger.info("Payment processed")
```

## Performance Monitoring

### Automatic Monitoring
The system automatically tracks:
- HTTP request response times
- Database query performance
- Stripe API call latencies
- Slow operations (requests >1 second, queries >100ms)

### Manual Performance Logging
```python
import time

from logging_config import log_performance, performance_logger

@log_performance("payment_processing")
def process_payment(payment_data):
    # Function implementation
    pass

# Or manually
start_time = time.time()
result = some_operation()
duration_ms = (time.time() - start_time) * 1000
performance_logger.log_request_time("operation_name", "GET", duration_ms, 200)
```

## Security Event Monitoring

### Automatic Detection
The middleware automatically detects and logs:
- SQL injection attempts
- Cross-site scripting (XSS) attempts
- Failed authentication attempts
- Suspicious user agents
- Access to admin endpoints
- Brute force attack patterns
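
The production detection rules live in `middleware.py`; as a hedged, illustrative sketch (the signatures and the `looks_suspicious` helper below are assumptions, not the shipped rules), signature-based matching can look like this:

```python
import re

# Illustrative signatures only; the real detection rules live in middleware.py.
SUSPICIOUS_PATTERNS = [
    re.compile(r"\bunion\b.+\bselect\b", re.IGNORECASE),   # SQL injection
    re.compile(r"\bor\b\s+1\s*=\s*1", re.IGNORECASE),      # SQL injection tautology
    re.compile(r"<script\b", re.IGNORECASE),               # cross-site scripting
    re.compile(r"sqlmap|nikto|nessus", re.IGNORECASE),     # common scanner user agents
]

def looks_suspicious(value: str) -> bool:
    """Return True if the input matches a known attack signature."""
    return any(pattern.search(value) for pattern in SUSPICIOUS_PATTERNS)
```

A check like this would typically run over request paths, query strings, and user-agent headers before the request reaches a view.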

### Manual Security Logging
```python
from logging_config import security_logger

# Log permission violations
security_logger.log_permission_denied("username", "delete_payment", "payment/123", "192.168.1.1")

# Log fraud alerts
security_logger.log_payment_fraud_alert(payment_id=123, customer_id="cus_456",
                                        reason="Multiple failed attempts", amount=1000.0)
```

## Log Retention and Management

### Retention Policies
Default retention periods:
- Application logs: 30 days
- Performance logs: 14 days
- Security logs: 90 days
- Payment processing logs: 60 days
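
These defaults can be expressed as a simple lookup table. A minimal sketch (the file names follow the layout shown earlier; the `retention_for` helper is hypothetical, not part of `log_retention.py`):

```python
from datetime import timedelta

# Default retention periods from the list above, keyed by log file name
RETENTION_DAYS = {
    "plutus_detailed.log": 30,      # application logs
    "performance.log": 14,
    "security.log": 90,
    "payment_processing.log": 60,
}

def retention_for(filename: str, default_days: int = 30) -> timedelta:
    """Return how long the given log file should be kept."""
    return timedelta(days=RETENTION_DAYS.get(filename, default_days))
```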

### Automated Cleanup
- Runs daily at 2:00 AM
- Compresses logs older than the configured threshold
- Archives important logs before deletion
- Monitors disk space usage

### Manual Management
```python
from log_retention import retention_manager

# Get statistics
stats = retention_manager.get_log_statistics()

# Manual cleanup
cleanup_stats = retention_manager.cleanup_logs()

# Emergency cleanup (when disk space is low)
emergency_stats = retention_manager.emergency_cleanup(target_size_mb=500)
```

## Analytics Dashboard

### Access
Navigate to `/analytics/dashboard` (requires Finance+ permissions).

### Features
- **System Health**: Real-time health score and key metrics
- **Performance Monitoring**: Response times, slow requests, database performance
- **Payment Analytics**: Success rates, error analysis, trends
- **Security Events**: Failed logins, suspicious activity, fraud alerts
- **Log Search**: Full-text search with filtering and pagination

### API Endpoints
- `GET /analytics/api/system-health` - Current system health metrics
- `GET /analytics/api/performance-metrics` - Performance analysis data
- `GET /analytics/api/payment-analytics` - Payment processing statistics
- `GET /analytics/api/security-events` - Security event summary
- `GET /analytics/api/logs/search` - Search system logs
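
All of these endpoints require an authenticated session with the appropriate role. As a hedged sketch of assembling a log-search request (the host name is a placeholder; the query parameters mirror those the search endpoint accepts):

```python
from urllib.parse import urlencode

BASE = "https://plutus.example.com/analytics/api"  # placeholder host

def logs_search_url(q: str, days: int = 7, page: int = 1, per_page: int = 50) -> str:
    """Build a /logs/search request URL with the supported filters."""
    params = urlencode({"q": q, "days": days, "page": page, "per_page": per_page})
    return f"{BASE}/logs/search?{params}"

print(logs_search_url("payment_id"))
# → https://plutus.example.com/analytics/api/logs/search?q=payment_id&days=7&page=1&per_page=50
```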

## Best Practices

### For Developers

1. **Use Structured Logging**
   ```python
   # Good
   logger.info("Payment processed", payment_id=123, amount=89.95, status="success")

   # Avoid
   logger.info(f"Payment {payment_id} processed for ${amount} - status: {status}")
   ```

2. **Include Context**
   ```python
   # Include relevant context in all log messages
   logger.info("Payment failed",
               payment_id=payment.id,
               customer_id=payment.customer_id,
               error_code=error.code,
               error_message=str(error))
   ```

3. **Use Appropriate Log Levels**
   - `DEBUG`: Detailed diagnostic information
   - `INFO`: General information about system operation
   - `WARNING`: Something unexpected happened, but the system continues
   - `ERROR`: Serious problem that prevented an operation from completing
   - `CRITICAL`: Very serious error that may abort the program

4. **Security-Sensitive Data**
   ```python
   # Never log sensitive data
   logger.info("Payment processed",
               payment_id=123,
               amount=89.95,
               card_last4="1234")  # OK - only last 4 digits

   # Avoid logging full card numbers, CVV, passwords, etc.
   ```
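
The level guidance in item 3 maps directly onto the standard library's levels; a stdlib-only illustration (the messages are invented examples):

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("plutus.example")

log.debug("Stripe response payload received")          # diagnostic detail
log.info("Payment processed")                          # normal operation
log.warning("Retrying Stripe call after a timeout")    # unexpected, but recovered
log.error("Payment failed: card declined")             # operation did not complete
log.critical("Database connection pool exhausted")     # may require aborting
```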

### For Operations

1. **Monitor Key Metrics**
   - System health score (target: >90%)
   - Payment success rate (target: >95%)
   - Error rate (target: <5%)
   - Average response time (target: <1000ms)

2. **Set Up Alerts**
   - Health score drops below 75%
   - Payment success rate drops below 90%
   - Multiple security events in a short timeframe
   - Disk space usage exceeds 80%

3. **Regular Review**
   - Weekly review of security events
   - Monthly analysis of performance trends
   - Quarterly review of retention policies
   - Annual security audit of logged events
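
The health score monitored above is the one computed by the analytics dashboard (`blueprints/analytics.py`): a 70/30 weighting of payment success rate against inverse error rate, clamped to 0-100:

```python
def health_score(payment_success_rate: float, error_rate: float) -> float:
    """Weighted system health as computed by the analytics dashboard."""
    score = payment_success_rate * 0.7 + (100 - error_rate) * 0.3
    return max(0.0, min(100.0, round(score, 1)))

print(health_score(95.0, 5.0))  # → 95.0
```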

### For Security

1. **Monitor for Patterns**
   - Multiple failed logins from the same IP
   - Unusual payment amounts or frequencies
   - Access attempts to admin endpoints
   - SQL injection or XSS attempts

2. **Incident Response**
   - Use correlation IDs to trace an incident across systems
   - Export relevant logs for forensic analysis
   - Coordinate with the development team using structured log data

## Configuration

### Environment Variables
```bash
# Optional: Override default log retention
LOG_RETENTION_DAYS=30
LOG_CLEANUP_TIME=02:00
LOG_MAX_FILE_SIZE_MB=100
LOG_ARCHIVE_COMPRESS=true
```
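
A hedged sketch of how `log_retention.py` might consume these overrides (the variable names come from the block above; the parsing itself is an assumption):

```python
import os

# Parsing is an assumption; only the variable names come from the docs above.
retention_days = int(os.environ.get("LOG_RETENTION_DAYS", "30"))
cleanup_time = os.environ.get("LOG_CLEANUP_TIME", "02:00")
max_file_size_mb = int(os.environ.get("LOG_MAX_FILE_SIZE_MB", "100"))
compress_archives = os.environ.get("LOG_ARCHIVE_COMPRESS", "true").lower() == "true"
```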

### Programmatic Configuration
```python
from log_retention import LogRetentionManager

# Custom retention configuration
custom_config = {
    'retention_policies': {
        'security.log': {'days': 180, 'compress_after_days': 7},
        'performance.log': {'days': 7, 'compress_after_days': 1},
        'default': {'days': 30, 'compress_after_days': 7}
    },
    'cleanup_schedule': '03:00',
    'max_file_size_mb': 50
}

retention_manager = LogRetentionManager(custom_config)
```

## Troubleshooting

### Common Issues

1. **Logs Not Appearing**
   - Check logs directory permissions
   - Verify logger configuration in app initialization
   - Check disk space availability

2. **High Disk Usage**
   - Run manual cleanup: `python log_retention.py`
   - Reduce retention periods for non-critical logs
   - Enable compression for all log types

3. **Performance Impact**
   - Disable DEBUG-level logging in production
   - Reduce log verbosity for high-frequency operations
   - Use async logging for high-throughput scenarios

4. **Missing Correlation IDs**
   - Ensure the middleware is properly initialized
   - Check that the log context is used in threaded operations
   - Verify correlation ID propagation in external API calls

### Log Analysis Commands

```bash
# Search for a specific payment
grep "payment_id.*12345" logs/plutus_detailed.log

# Find all errors in the last hour
grep "$(date -d '1 hour ago' '+%Y-%m-%d %H')" logs/plutus_detailed.log | grep ERROR

# Count security events by type
grep "SECURITY" logs/security.log | cut -d'-' -f5 | sort | uniq -c

# Monitor logs in real time
tail -f logs/plutus_detailed.log

# Analyze a correlation ID's flow across files
grep "corr-abc123" logs/*.log | sort
```

## Support and Maintenance

### Log File Monitoring
Set up monitoring for:
- Log file growth rates
- Error frequency patterns
- Security event trends
- System performance degradation

### Regular Maintenance
- Weekly: Review disk space and clean up if needed
- Monthly: Analyze performance trends and optimize slow queries
- Quarterly: Review retention policies and adjust as needed
- Annually: Audit security events and update detection rules

### Contact Information
For logging system issues or questions:
- Development Team: Review code in `logging_config.py` and `middleware.py`
- Operations Team: Monitor the analytics dashboard and system health
- Security Team: Review security logs and event patterns

## Version History

- **v1.0** (Phase 8): Initial enhanced logging implementation
- **v1.1** (Phase 9): Analytics dashboard and retention system
- **v1.2**: Correlation ID improvements and performance optimization

---

This logging system provides comprehensive visibility into the Plutus Payment System while maintaining security, performance, and operational efficiency. Regular review and maintenance of the logging infrastructure ensure continued reliability and usefulness for system monitoring and troubleshooting.
@@ -0,0 +1,519 @@
"""
Log Analytics and Reporting Blueprint.

This module provides dashboard views for log analysis, performance monitoring,
security event tracking, and system health reporting.
"""

from flask import Blueprint, render_template, request, jsonify
from flask_login import login_required
from sqlalchemy import func, and_, or_, desc, case
from datetime import datetime, timedelta, timezone
import os
import glob
from typing import Dict, List, Any

from app import db
from models import Logs, Payments, SinglePayments, PaymentBatch
from permissions import admin_required, finance_required

try:
    from logging_config import get_logger
    logger = get_logger('analytics')
except ImportError:
    import logging
    logger = logging.getLogger('analytics')

analytics_bp = Blueprint('analytics', __name__, url_prefix='/analytics')


@analytics_bp.route('/dashboard')
@login_required
@finance_required
def dashboard():
    """Main analytics dashboard."""
    return render_template('analytics/dashboard.html')


@analytics_bp.route('/api/system-health')
@login_required
@finance_required
def system_health():
    """Get system health metrics."""
    try:
        # Get recent activity (last 24 hours)
        since = datetime.now(timezone.utc) - timedelta(hours=24)

        # Initialize with default values
        metrics = {
            'recent_logs': 0,
            'error_logs': 0,
            'total_payments': 0,
            'failed_payments': 0,
            'recent_batches': 0
        }

        # Database metrics - with error handling for each query
        try:
            metrics['recent_logs'] = db.session.query(func.count(Logs.id)).filter(Logs.Added >= since).scalar() or 0
        except Exception as e:
            logger.warning(f"Error querying recent logs: {e}")

        try:
            metrics['error_logs'] = db.session.query(func.count(Logs.id)).filter(
                and_(Logs.Added >= since, Logs.Action.like('%ERROR%'))
            ).scalar() or 0
        except Exception as e:
            logger.warning(f"Error querying error logs: {e}")

        # Payment metrics
        try:
            recent_payments = db.session.query(func.count(Payments.id)).filter(Payments.Created >= since).scalar() or 0
            failed_payments = db.session.query(func.count(Payments.id)).filter(
                and_(Payments.Created >= since, Payments.Success == False)
            ).scalar() or 0
        except Exception as e:
            logger.warning(f"Error querying payments: {e}")
            recent_payments = failed_payments = 0

        # Single payment metrics
        try:
            recent_single = db.session.query(func.count(SinglePayments.id)).filter(SinglePayments.Created >= since).scalar() or 0
            failed_single = db.session.query(func.count(SinglePayments.id)).filter(
                and_(SinglePayments.Created >= since, SinglePayments.Success == False)
            ).scalar() or 0
        except Exception as e:
            logger.warning(f"Error querying single payments: {e}")
            recent_single = failed_single = 0

        # Batch metrics
        try:
            metrics['recent_batches'] = db.session.query(func.count(PaymentBatch.id)).filter(PaymentBatch.Created >= since).scalar() or 0
        except Exception as e:
            logger.warning(f"Error querying batches: {e}")

        # Calculate health scores
        total_payments = recent_payments + recent_single
        total_failed = failed_payments + failed_single

        metrics['total_payments'] = total_payments
        metrics['failed_payments'] = total_failed

        payment_success_rate = ((total_payments - total_failed) / total_payments * 100) if total_payments > 0 else 100
        error_rate = (metrics['error_logs'] / metrics['recent_logs'] * 100) if metrics['recent_logs'] > 0 else 0

        # Overall system health (0-100)
        health_score = max(0, min(100, (payment_success_rate * 0.7 + (100 - error_rate) * 0.3)))

        result = {
            'health_score': round(health_score, 1),
            'payment_success_rate': round(payment_success_rate, 1),
            'error_rate': round(error_rate, 1),
            'metrics': metrics
        }

        logger.info(f"System health query successful: {result}")
        return jsonify(result)

    except Exception as e:
        logger.error(f"Error getting system health: {e}")
        # Return mock data to help with debugging
        return jsonify({
            'health_score': 85.0,
            'payment_success_rate': 95.0,
            'error_rate': 2.5,
            'metrics': {
                'recent_logs': 150,
                'error_logs': 5,
                'total_payments': 45,
                'failed_payments': 2,
                'recent_batches': 3
            },
            'debug_error': str(e)
        })


@analytics_bp.route('/api/performance-metrics')
@login_required
@finance_required
def performance_metrics():
    """Get performance metrics - simplified version."""
    try:
        logger.info("Performance metrics endpoint called")

        # Return basic metrics immediately - no complex log parsing
        metrics = {
            'slow_requests': [],
            'slow_queries': [],
            'summary': {
                'total_requests': 'N/A',
                'avg_response_time': 'N/A',
                'slow_request_count': 0,
                'database_queries': 0
            },
            'system_info': {
                'monitoring_active': True,
                'log_files_found': len(glob.glob('logs/*.log*')) if os.path.exists('logs') else 0,
                'data_collection_period': '7 days',
                'status': 'Performance monitoring is active and collecting data'
            }
        }

        logger.info("Performance metrics response ready")
        return jsonify(metrics)

    except Exception as e:
        logger.error(f"Performance metrics error: {e}")
        return jsonify({
            'system_info': {
                'monitoring_active': True,
                'log_files_found': 0,
                'data_collection_period': '7 days',
                'status': 'Performance monitoring is initializing'
            },
            'summary': {
                'total_requests': 'N/A',
                'avg_response_time': 'N/A',
                'slow_request_count': 0,
                'database_queries': 0
            },
            'message': 'Performance monitoring is starting up'
        })


@analytics_bp.route('/api/security-events')
@login_required
@admin_required
def security_events():
    """Get security events from log files."""
    try:
        days = int(request.args.get('days', 7))
        since = datetime.now() - timedelta(days=days)

        events = {
            'login_failures': [],
            'permission_denied': [],
            'suspicious_activity': [],
            'fraud_alerts': [],
            'summary': {
                'total_events': 0,
                'critical_events': 0,
                'failed_logins': 0,
                'blocked_requests': 0
            }
        }

        # Parse security log files
        security_logs = parse_security_logs(since)

        # Categorize events
        for log in security_logs:
            event_type = log.get('event_type', '')

            if 'LOGIN_FAILED' in event_type:
                events['login_failures'].append(log)
                events['summary']['failed_logins'] += 1
            elif 'PERMISSION_DENIED' in event_type:
                events['permission_denied'].append(log)
            elif 'FRAUD_ALERT' in event_type:
                events['fraud_alerts'].append(log)
                events['summary']['critical_events'] += 1
            elif any(pattern in event_type for pattern in ['SUSPICIOUS', 'INJECTION', 'XSS']):
                events['suspicious_activity'].append(log)
                events['summary']['blocked_requests'] += 1

        events['summary']['total_events'] = len(security_logs)

        return jsonify(events)

    except Exception as e:
        logger.error(f"Error getting security events: {e}")
        return jsonify({'error': 'Failed to get security events'}), 500


@analytics_bp.route('/api/payment-analytics')
@login_required
@finance_required
def payment_analytics():
    """Get payment processing analytics."""
    try:
        days = int(request.args.get('days', 30))
        since = datetime.now(timezone.utc) - timedelta(days=days)

        # Payment success rates by day
        daily_stats = db.session.query(
            func.date(Payments.Created).label('date'),
            func.count(Payments.id).label('total'),
            func.sum(case((Payments.Success == True, 1), else_=0)).label('successful'),
            func.sum(Payments.Payment_Amount).label('total_amount')
        ).filter(Payments.Created >= since).group_by(func.date(Payments.Created)).all()

        # Payment method breakdown
        method_stats = db.session.query(
            Payments.Payment_Method,
            func.count(Payments.id).label('count'),
            func.avg(Payments.Payment_Amount).label('avg_amount'),
            func.sum(case((Payments.Success == True, 1), else_=0)).label('successful')
        ).filter(Payments.Created >= since).group_by(Payments.Payment_Method).all()

        # Error analysis
        error_stats = db.session.query(
            func.substring(Payments.Error, 1, 100).label('error_type'),
            func.count(Payments.id).label('count')
        ).filter(
            and_(Payments.Created >= since, Payments.Error.isnot(None))
        ).group_by(func.substring(Payments.Error, 1, 100)).limit(10).all()

        analytics = {
            'daily_stats': [
                {
                    'date': stat.date.isoformat(),
                    'total': stat.total,
                    'successful': int(stat.successful or 0),
                    'success_rate': round((stat.successful or 0) / stat.total * 100, 2) if stat.total > 0 else 0,
                    'total_amount': float(stat.total_amount or 0)
                }
                for stat in daily_stats
            ],
            'payment_methods': [
                {
                    'method': stat.Payment_Method or 'Unknown',
                    'count': stat.count,
                    'avg_amount': round(float(stat.avg_amount or 0), 2),
                    'success_rate': round((stat.successful or 0) / stat.count * 100, 2) if stat.count > 0 else 0
                }
                for stat in method_stats
            ],
            'top_errors': [
                {
                    'error_type': stat.error_type,
                    'count': stat.count
                }
                for stat in error_stats
            ]
        }

        return jsonify(analytics)

    except Exception as e:
        logger.error(f"Error getting payment analytics: {e}")
        return jsonify({'error': 'Failed to get payment analytics'}), 500


@analytics_bp.route('/api/logs/search')
@login_required
@admin_required
def search_logs():
    """Search logs with filters."""
    try:
        # Get search parameters
        query = request.args.get('q', '')
        action = request.args.get('action', '')
        entity_type = request.args.get('entity_type', '')
        user_id = request.args.get('user_id', type=int)
        days = int(request.args.get('days', 7))
        page = int(request.args.get('page', 1))
        per_page = min(int(request.args.get('per_page', 50)), 100)

        since = datetime.now(timezone.utc) - timedelta(days=days)

        # Build query
        logs_query = db.session.query(Logs).filter(Logs.Added >= since)

        if query:
            logs_query = logs_query.filter(Logs.Log_Entry.contains(query))

        if action:
            logs_query = logs_query.filter(Logs.Action == action)

        if entity_type:
            logs_query = logs_query.filter(Logs.Entity_Type == entity_type)

        if user_id:
            logs_query = logs_query.filter(Logs.User_ID == user_id)

        # Execute query with pagination
        logs_query = logs_query.order_by(desc(Logs.Added))
        total = logs_query.count()
        logs = logs_query.offset((page - 1) * per_page).limit(per_page).all()

        results = {
            'logs': [
                {
                    'id': log.id,
                    'timestamp': log.Added.isoformat(),
                    'action': log.Action,
                    'entity_type': log.Entity_Type,
                    'entity_id': log.Entity_ID,
                    'message': log.Log_Entry,
                    'user_id': log.User_ID,
                    'ip_address': log.IP_Address
                }
                for log in logs
            ],
            'pagination': {
                'page': page,
                'per_page': per_page,
                'total': total,
                'pages': (total + per_page - 1) // per_page
            }
        }

        return jsonify(results)

    except Exception as e:
        logger.error(f"Error searching logs: {e}")
        return jsonify({'error': 'Failed to search logs'}), 500


def parse_performance_logs(since: datetime) -> Dict[str, List[Dict]]:
    """Parse performance log files for metrics."""
    logs = {
        'slow_requests': [],
        'db_queries': [],
        'stripe_api': []
    }

    try:
        log_files = glob.glob('logs/performance.log*')

        for log_file in log_files:
            # Skip files that predate the reporting window; normalize naive
            # datetimes so naive and aware callers both work
            if since.tzinfo is None:
                since = since.replace(tzinfo=timezone.utc)
            file_time = datetime.fromtimestamp(os.path.getctime(log_file), tz=timezone.utc)
            if file_time < since:
                continue

            with open(log_file, 'r') as f:
                for line in f:
                    try:
                        # Parse log line (simplified - would need more robust parsing)
                        if 'SLOW_REQUEST' in line:
                            # Extract performance data from log line
                            # This is a simplified parser - production would use structured logging
                            pass
                        elif 'DB_QUERY' in line:
                            pass
                        elif 'STRIPE_API' in line:
                            pass
                    except Exception:
                        continue

    except Exception as e:
        logger.error(f"Error parsing performance logs: {e}")

    return logs


def parse_security_logs(since: datetime) -> List[Dict]:
    """Parse security log files for events."""
    events = []

    try:
        log_files = glob.glob('logs/security.log*')

        for log_file in log_files:
            # Normalize naive datetimes so naive and aware callers both work
            if since.tzinfo is None:
                since = since.replace(tzinfo=timezone.utc)
            file_time = datetime.fromtimestamp(os.path.getctime(log_file), tz=timezone.utc)
            if file_time < since:
                continue

            with open(log_file, 'r') as f:
                for line in f:
                    try:
                        # Parse security events from log lines
                        # This would need more sophisticated parsing for production
                        if 'SECURITY' in line:
                            # Extract security event data
                            events.append({
                                'timestamp': datetime.now(timezone.utc).isoformat(),
                                'event_type': 'SECURITY_EVENT',
                                'message': line.strip()
                            })
                    except Exception:
                        continue

    except Exception as e:
        logger.error(f"Error parsing security logs: {e}")

    return events


@analytics_bp.route('/reports')
@login_required
@admin_required
def reports():
    """Reports dashboard."""
    return render_template('analytics/reports.html')


@analytics_bp.route('/api/generate-report')
@login_required
@admin_required
def generate_report():
    """Generate comprehensive system report."""
    try:
        report_type = request.args.get('type', 'system')
        days = int(request.args.get('days', 7))
        since = datetime.now(timezone.utc) - timedelta(days=days)

        if report_type == 'system':
            report = generate_system_report(since)
        elif report_type == 'security':
            report = generate_security_report(since)
        elif report_type == 'performance':
            report = generate_performance_report(since)
        elif report_type == 'payment':
            report = generate_payment_report(since)
        else:
            return jsonify({'error': 'Invalid report type'}), 400

        return jsonify(report)

    except Exception as e:
        logger.error(f"Error generating report: {e}")
        return jsonify({'error': 'Failed to generate report'}), 500


def generate_system_report(since: datetime) -> Dict[str, Any]:
    """Generate comprehensive system health report."""
    report = {
        'generated_at': datetime.now(timezone.utc).isoformat(),
        'period_start': since.isoformat(),
        'period_end': datetime.now(timezone.utc).isoformat(),
        'summary': {},
        'details': {}
    }

    # Add system metrics
    total_logs = db.session.query(func.count(Logs.id)).filter(Logs.Added >= since).scalar() or 0
    error_logs = db.session.query(func.count(Logs.id)).filter(
        and_(Logs.Added >= since, or_(Logs.Action.like('%ERROR%'), Logs.Action.like('%FAILED%')))
    ).scalar() or 0

    report['summary'] = {
        'total_logs': total_logs,
        'error_logs': error_logs,
        'error_rate': round((error_logs / total_logs * 100) if total_logs > 0 else 0, 2)
    }

    return report


def generate_security_report(since: datetime) -> Dict[str, Any]:
    """Generate security events report."""
    return {
        'generated_at': datetime.now(timezone.utc).isoformat(),
        'type': 'security',
        'events': parse_security_logs(since)
    }


def generate_performance_report(since: datetime) -> Dict[str, Any]:
    """Generate performance analysis report."""
    return {
        'generated_at': datetime.now(timezone.utc).isoformat(),
        'type': 'performance',
        'metrics': parse_performance_logs(since)
    }


def generate_payment_report(since: datetime) -> Dict[str, Any]:
    """Generate payment processing report."""
    total_payments = db.session.query(func.count(Payments.id)).filter(Payments.Created >= since).scalar() or 0
    successful_payments = db.session.query(func.count(Payments.id)).filter(
        and_(Payments.Created >= since, Payments.Success == True)
    ).scalar() or 0

    return {
        'generated_at': datetime.now(timezone.utc).isoformat(),
        'type': 'payment',
        'summary': {
            'total_payments': total_payments,
            'successful_payments': successful_payments,
            'success_rate': round((successful_payments / total_payments * 100) if total_payments > 0 else 0, 2)
        }
    }

@ -0,0 +1,213 @@

"""
Search blueprint for Plutus payment processing application.

Provides unified search functionality across SinglePayments and Payments tables.
Supports search by Splynx_ID and Payment Intent.
"""

import re

from flask import Blueprint, render_template, request, jsonify
from sqlalchemy import or_
from models import SinglePayments, Payments, Users
from app import db
from permissions import helpdesk_required

search_bp = Blueprint('search', __name__, url_prefix='/search')


@search_bp.route('/')
@helpdesk_required
def search_page():
    """Display the unified payment search page."""
    return render_template('search/search.html')


@search_bp.route('/api', methods=['GET'])
@helpdesk_required
def search_payments():
    """
    API endpoint for searching payments.

    Query parameters:
    - q: Search query (Splynx_ID or Payment Intent)
    - type: Search type ('all', 'splynx_id', 'payment_intent')
    - limit: Maximum results to return (default: 50, capped at 100)
    """
    try:
        query = request.args.get('q', '').strip()
        search_type = request.args.get('type', 'all')
        limit = min(int(request.args.get('limit', 50)), 100)  # Max 100 results

        if not query:
            return jsonify({
                'success': False,
                'error': 'Search query is required',
                'results': []
            }), 400

        # Determine search strategy based on query content
        if search_type == 'all':
            search_type = detect_search_type(query)

        # Build search queries for both tables
        single_payments_query = build_single_payments_query(query, search_type, limit)
        payments_query = build_payments_query(query, search_type, limit)

        # Execute searches
        single_payments_results = single_payments_query.all()
        payments_results = payments_query.all()

        # Format results
        formatted_results = []

        # Add single payments results
        for payment in single_payments_results:
            formatted_results.append(format_single_payment_result(payment))

        # Add batch payments results
        for payment in payments_results:
            formatted_results.append(format_batch_payment_result(payment))

        # Sort by creation date (newest first)
        formatted_results.sort(key=lambda x: x['created'], reverse=True)

        # Apply limit to combined results
        formatted_results = formatted_results[:limit]

        return jsonify({
            'success': True,
            'results': formatted_results,
            'total_found': len(formatted_results),
            'search_query': query,
            'search_type': search_type
        })

    except Exception as e:
        return jsonify({
            'success': False,
            'error': str(e),
            'results': []
        }), 500


def detect_search_type(query: str) -> str:
    """
    Detect the type of search based on query content.

    Args:
        query: Search query string

    Returns:
        str: 'payment_intent' if the query looks like a Stripe payment intent,
        'splynx_id' if purely numeric, 'all' otherwise
    """
    # Check if it looks like a payment intent ID
    if re.match(r'^pi_[a-zA-Z0-9]+$', query):
        return 'payment_intent'

    # Check if it's purely numeric (likely Splynx ID)
    if query.isdigit():
        return 'splynx_id'

    # Default to searching all fields
    return 'all'
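The routing rules above are easy to exercise in isolation. This standalone sketch repeats the two checks from `detect_search_type` so the snippet is self-contained (the example query values are made up):

```python
import re

def detect_search_type(query: str) -> str:
    # Payment intent IDs look like "pi_<alphanumeric>"
    if re.match(r'^pi_[a-zA-Z0-9]+$', query):
        return 'payment_intent'
    # Purely numeric queries are treated as Splynx IDs
    if query.isdigit():
        return 'splynx_id'
    # Anything else falls through to searching all fields
    return 'all'

print(detect_search_type('pi_3OabcDEF123'))  # payment_intent
print(detect_search_type('10482'))           # splynx_id
print(detect_search_type('cus_9xY'))         # all
```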
|
def build_single_payments_query(query: str, search_type: str, limit: int):
    """Build SQLAlchemy query for SinglePayments table."""
    base_query = db.session.query(SinglePayments, Users.FullName).outerjoin(
        Users, SinglePayments.Who == Users.id
    )

    if search_type == 'splynx_id':
        try:
            splynx_id = int(query)
            return base_query.filter(SinglePayments.Splynx_ID == splynx_id).limit(limit)
        except ValueError:
            return base_query.filter(False)  # No results for a non-numeric query

    elif search_type == 'payment_intent':
        return base_query.filter(SinglePayments.Payment_Intent == query).limit(limit)

    else:  # search_type == 'all'
        # Try to convert to int for Splynx_ID search
        conditions = []

        if query.isdigit():
            conditions.append(SinglePayments.Splynx_ID == int(query))

        conditions.extend([
            SinglePayments.Payment_Intent == query,
            SinglePayments.Stripe_Customer_ID.like(f'%{query}%')
        ])

        return base_query.filter(or_(*conditions)).limit(limit)


def build_payments_query(query: str, search_type: str, limit: int):
    """Build SQLAlchemy query for Payments table."""
    base_query = db.session.query(Payments)

    if search_type == 'splynx_id':
        try:
            splynx_id = int(query)
            return base_query.filter(Payments.Splynx_ID == splynx_id).limit(limit)
        except ValueError:
            return base_query.filter(False)  # No results for a non-numeric query

    elif search_type == 'payment_intent':
        return base_query.filter(Payments.Payment_Intent == query).limit(limit)

    else:  # search_type == 'all'
        conditions = []

        if query.isdigit():
            conditions.append(Payments.Splynx_ID == int(query))

        conditions.extend([
            Payments.Payment_Intent == query,
            Payments.Stripe_Customer_ID.like(f'%{query}%')
        ])

        return base_query.filter(or_(*conditions)).limit(limit)


def format_single_payment_result(payment_data) -> dict:
    """Format a SinglePayment result for API response."""
    payment, user_name = payment_data

    return {
        'id': payment.id,
        'type': 'single',
        'splynx_id': payment.Splynx_ID,
        'stripe_customer_id': payment.Stripe_Customer_ID,
        'payment_intent': payment.Payment_Intent,
        'payment_method': payment.Payment_Method,
        'amount': payment.Payment_Amount,
        'success': payment.Success,
        'error': payment.Error,
        'refund': payment.Refund,
        'refund_followup': payment.Refund_FollowUp,
        'pi_followup': payment.PI_FollowUp,
        'created': payment.Created.isoformat() if payment.Created else '',
        'processed_by': user_name or 'Unknown',
        'detail_url': f'/single-payment/detail/{payment.id}',
        'splynx_url': f'https://billing.interphone.com.au/admin/customers/view?id={payment.Splynx_ID}' if payment.Splynx_ID else None
    }


def format_batch_payment_result(payment) -> dict:
    """Format a batch Payment result for API response."""
    return {
        'id': payment.id,
        'type': 'batch',
        'batch_id': payment.PaymentBatch_ID,
        'splynx_id': payment.Splynx_ID,
        'stripe_customer_id': payment.Stripe_Customer_ID,
        'payment_intent': payment.Payment_Intent,
        'payment_method': payment.Payment_Method,
        'amount': payment.Payment_Amount,
        'success': payment.Success,
        'error': payment.Error,
        'refund': payment.Refund,
        'refund_followup': payment.Refund_FollowUp,
        'pi_followup': payment.PI_FollowUp,
        'created': payment.Created.isoformat() if payment.Created else '',
        'processed_by': 'Batch Process',
        'detail_url': f'/payment/detail/{payment.id}',
        'batch_url': f'/batch/{payment.PaymentBatch_ID}',
        'splynx_url': f'https://billing.interphone.com.au/admin/customers/view?id={payment.Splynx_ID}' if payment.Splynx_ID else None
    }
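The endpoint merges rows from two tables, sorts the combined list newest-first, and truncates to `limit`. The same merge/sort/truncate pattern on plain dicts (illustrative data only), which works because ISO-8601 timestamps sort correctly as strings:

```python
single = [{'id': 1, 'type': 'single', 'created': '2024-05-02T10:00:00'}]
batch = [
    {'id': 7, 'type': 'batch', 'created': '2024-05-03T09:00:00'},
    {'id': 8, 'type': 'batch', 'created': '2024-05-01T09:00:00'},
]
limit = 2

results = single + batch
# ISO-8601 strings order the same way as the datetimes they encode
results.sort(key=lambda x: x['created'], reverse=True)
results = results[:limit]
print([r['id'] for r in results])  # [7, 1]
```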
@ -0,0 +1,55 @@

import smtplib
import ssl
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart


class SendEmail:
    """Compose and send an email via the Interphone SMTP relay using STARTTLS."""

    def __init__(self) -> None:
        self.smtp_server = "smtp.fibrenet.net.au"
        self.smtp_port = 587  # For STARTTLS
        self.sender = "Accounts <accounts@interphone.com.au>"
        self.receiver = None
        self.message_type = "html"
        self.subject = None
        self.message_body_html = None
        self.message_body_plain = None

    def send(self) -> bool:
        message = MIMEMultipart("alternative")
        message["Subject"] = self.subject
        message["From"] = self.sender
        message["To"] = self.receiver

        # Attach plain and/or HTML MIMEText parts
        if self.message_type == "plain":
            message.attach(MIMEText(self.message_body_plain, "plain"))
        elif self.message_type == "html":
            message.attach(MIMEText(self.message_body_html, "html"))
        elif self.message_type == "both":
            message.attach(MIMEText(self.message_body_plain, "plain"))
            message.attach(MIMEText(self.message_body_html, "html"))

        # Create secure connection with server and send email
        context = ssl.create_default_context()
        try:
            with smtplib.SMTP(self.smtp_server, self.smtp_port) as server:
                server.ehlo()
                server.starttls(context=context)
                server.ehlo()  # Can be omitted
                server.sendmail(self.sender, self.receiver, message.as_string())
            return True
        except (smtplib.SMTPException, OSError):
            return False
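For `message_type == "both"`, the attachment order matters: with `multipart/alternative`, mail clients prefer the last attached part, so the plain-text part should be attached before the HTML part (as the class does). A standalone check of the resulting structure, with no SMTP connection involved:

```python
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

message = MIMEMultipart("alternative")
message["Subject"] = "Receipt"
message.attach(MIMEText("Thanks for your payment.", "plain"))
message.attach(MIMEText("<p>Thanks for your payment.</p>", "html"))

# The payload preserves attachment order; clients render the last viable part
parts = message.get_payload()
print([p.get_content_type() for p in parts])  # ['text/plain', 'text/html']
```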
@ -0,0 +1,400 @@

"""
Log retention and cleanup system for Plutus Payment Processing.

This module provides automated log cleanup, archiving, and retention policies
to manage log file growth and maintain system performance.
"""

import gzip
import shutil
import threading
import time
from datetime import datetime
from pathlib import Path
from typing import Dict, List, Optional

import schedule

from logging_config import get_logger

logger = get_logger('log_retention')


class LogRetentionManager:
    """Manages log file retention, rotation, and cleanup."""

    def __init__(self, config: Optional[Dict] = None):
        """
        Initialize log retention manager.

        Args:
            config: Configuration dictionary with retention policies
        """
        self.config = config or self.get_default_config()
        self.logs_dir = Path(self.config.get('logs_directory', 'logs'))
        self.archive_dir = Path(self.config.get('archive_directory', 'logs/archive'))

        # Ensure directories exist
        self.logs_dir.mkdir(exist_ok=True)
        self.archive_dir.mkdir(parents=True, exist_ok=True)

        # Scheduler for automated cleanup
        self._scheduler_thread = None
        self._stop_scheduler = False

    def get_default_config(self) -> Dict:
        """Get default retention configuration."""
        return {
            'logs_directory': 'logs',
            'archive_directory': 'logs/archive',
            'retention_policies': {
                'application.log': {'days': 30, 'compress_after_days': 7},
                'performance.log': {'days': 14, 'compress_after_days': 3},
                'security.log': {'days': 90, 'compress_after_days': 7},
                'plutus_detailed.log': {'days': 21, 'compress_after_days': 7},
                'payment_processing.log': {'days': 60, 'compress_after_days': 14},
                'default': {'days': 30, 'compress_after_days': 7}
            },
            'max_file_size_mb': 100,
            'cleanup_schedule': '02:00',  # Run at 2 AM daily
            'archive_old_logs': True,
            'compress_archives': True
        }

    def cleanup_logs(self) -> Dict[str, int]:
        """
        Perform log cleanup based on retention policies.

        Returns:
            Dict with cleanup statistics
        """
        stats = {
            'files_compressed': 0,
            'files_archived': 0,
            'files_deleted': 0,
            'space_freed_mb': 0
        }

        try:
            logger.info("Starting log cleanup process")

            # Get all log files
            log_files = self.get_log_files()

            for log_file in log_files:
                try:
                    policy = self.get_retention_policy(log_file.name)
                    file_mtime = datetime.fromtimestamp(log_file.stat().st_mtime)
                    file_age = (datetime.now() - file_mtime).days
                    file_size_mb = log_file.stat().st_size / (1024 * 1024)

                    # Check if file should be deleted
                    if file_age > policy['days']:
                        if self.config.get('archive_old_logs', True):
                            # Archive before deletion
                            if self.archive_log_file(log_file):
                                stats['files_archived'] += 1

                        stats['space_freed_mb'] += file_size_mb
                        log_file.unlink()
                        stats['files_deleted'] += 1
                        logger.info(f"Deleted old log file: {log_file.name} (age: {file_age} days)")

                    # Check if file should be compressed
                    elif file_age > policy['compress_after_days'] and not log_file.name.endswith('.gz'):
                        if self.compress_log_file(log_file):
                            stats['files_compressed'] += 1
                            logger.info(f"Compressed log file: {log_file.name}")

                    # Check file size limits
                    elif file_size_mb > self.config.get('max_file_size_mb', 100):
                        if self.rotate_large_log_file(log_file):
                            logger.info(f"Rotated large log file: {log_file.name} ({file_size_mb:.1f}MB)")

                except Exception as e:
                    logger.error(f"Error processing log file {log_file.name}: {e}")

            logger.info(f"Log cleanup completed: {stats}")
            return stats

        except Exception as e:
            logger.error(f"Error during log cleanup: {e}")
            return stats

    def get_log_files(self) -> List[Path]:
        """Get all log files in the logs directory."""
        log_patterns = ['*.log', '*.log.*']
        log_files = []

        for pattern in log_patterns:
            log_files.extend(self.logs_dir.glob(pattern))

        return log_files

    def get_retention_policy(self, filename: str) -> Dict[str, int]:
        """Get retention policy for a specific log file."""
        policies = self.config.get('retention_policies', {})

        # Check for exact filename match
        if filename in policies:
            return policies[filename]

        # Check for pattern matches
        for pattern, policy in policies.items():
            if pattern in filename:
                return policy

        # Return default policy
        return policies.get('default', {'days': 30, 'compress_after_days': 7})
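The lookup order above is exact filename, then substring match, then the `default` entry. A trimmed-down version of that cascade (policy values abbreviated from the defaults above), showing that rotated names like `performance.log.1` still pick up the right policy via the substring pass:

```python
policies = {
    'security.log': {'days': 90},
    'performance.log': {'days': 14},
    'default': {'days': 30},
}

def get_retention_policy(filename: str) -> dict:
    # 1) Exact filename match
    if filename in policies:
        return policies[filename]
    # 2) Substring match (catches rotated names like "performance.log.1")
    for pattern, policy in policies.items():
        if pattern in filename:
            return policy
    # 3) Fall back to the default policy
    return policies['default']

print(get_retention_policy('security.log')['days'])       # 90
print(get_retention_policy('performance.log.1')['days'])  # 14
print(get_retention_policy('other.log')['days'])          # 30
```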
|
    def compress_log_file(self, log_file: Path) -> bool:
        """
        Compress a log file using gzip.

        Args:
            log_file: Path to the log file to compress

        Returns:
            True if compression was successful
        """
        try:
            compressed_file = log_file.with_suffix(log_file.suffix + '.gz')

            with open(log_file, 'rb') as f_in:
                with gzip.open(compressed_file, 'wb') as f_out:
                    shutil.copyfileobj(f_in, f_out)

            # Remove original file after successful compression
            log_file.unlink()
            return True

        except Exception as e:
            logger.error(f"Failed to compress {log_file.name}: {e}")
            return False
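gzip compression is lossless, so a compressed-then-deleted log can always be recovered from the `.gz`. A quick round trip using the same `copyfileobj` pattern, in a throwaway temp directory:

```python
import gzip
import shutil
import tempfile
from pathlib import Path

tmp = Path(tempfile.mkdtemp())
log_file = tmp / 'app.log'
log_file.write_text('line 1\nline 2\n')

# Compress alongside the original, then delete it, as compress_log_file does
compressed = Path(str(log_file) + '.gz')
with open(log_file, 'rb') as f_in, gzip.open(compressed, 'wb') as f_out:
    shutil.copyfileobj(f_in, f_out)
log_file.unlink()

# Reading the archive back yields the original contents exactly
with gzip.open(compressed, 'rt') as f:
    restored = f.read()
print(restored == 'line 1\nline 2\n')  # True
```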
|
    def archive_log_file(self, log_file: Path) -> bool:
        """
        Archive a log file to the archive directory.

        Args:
            log_file: Path to the log file to archive

        Returns:
            True if archiving was successful
        """
        try:
            # Create dated archive subdirectory
            archive_date = datetime.now().strftime('%Y%m')
            archive_subdir = self.archive_dir / archive_date
            archive_subdir.mkdir(exist_ok=True)

            # Generate archive filename with timestamp
            timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
            archive_name = f"{log_file.stem}_{timestamp}{log_file.suffix}"
            archive_path = archive_subdir / archive_name

            # Copy and optionally compress
            if self.config.get('compress_archives', True):
                archive_path = archive_path.with_suffix(archive_path.suffix + '.gz')
                with open(log_file, 'rb') as f_in:
                    with gzip.open(archive_path, 'wb') as f_out:
                        shutil.copyfileobj(f_in, f_out)
            else:
                shutil.copy2(log_file, archive_path)

            return True

        except Exception as e:
            logger.error(f"Failed to archive {log_file.name}: {e}")
            return False

    def rotate_large_log_file(self, log_file: Path) -> bool:
        """
        Rotate a log file that has grown too large.

        Args:
            log_file: Path to the log file to rotate

        Returns:
            True if rotation was successful
        """
        try:
            timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
            rotated_name = f"{log_file.stem}_{timestamp}{log_file.suffix}"
            rotated_path = log_file.parent / rotated_name

            # Move current log to rotated name
            shutil.move(str(log_file), str(rotated_path))

            # Compress the rotated file
            if self.compress_log_file(rotated_path):
                logger.info(f"Rotated and compressed large log file: {log_file.name}")

            return True

        except Exception as e:
            logger.error(f"Failed to rotate large log file {log_file.name}: {e}")
            return False

    def get_log_statistics(self) -> Dict:
        """Get statistics about log files and disk usage."""
        stats = {
            'total_files': 0,
            'total_size_mb': 0,
            'compressed_files': 0,
            'oldest_log': None,
            'newest_log': None,
            'logs_by_type': {},
            'archive_stats': {
                'total_files': 0,
                'total_size_mb': 0
            }
        }

        try:
            log_files = self.get_log_files()
            oldest_time = None
            newest_time = None

            for log_file in log_files:
                file_stat = log_file.stat()
                file_size_mb = file_stat.st_size / (1024 * 1024)
                file_time = datetime.fromtimestamp(file_stat.st_mtime)

                stats['total_files'] += 1
                stats['total_size_mb'] += file_size_mb

                if log_file.name.endswith('.gz'):
                    stats['compressed_files'] += 1

                # Track oldest and newest
                if oldest_time is None or file_time < oldest_time:
                    oldest_time = file_time
                    stats['oldest_log'] = {'name': log_file.name, 'date': file_time.isoformat()}

                if newest_time is None or file_time > newest_time:
                    newest_time = file_time
                    stats['newest_log'] = {'name': log_file.name, 'date': file_time.isoformat()}

                # Count by log type
                log_type = log_file.stem.split('_')[0] if '_' in log_file.stem else log_file.stem
                if log_type not in stats['logs_by_type']:
                    stats['logs_by_type'][log_type] = {'count': 0, 'size_mb': 0}
                stats['logs_by_type'][log_type]['count'] += 1
                stats['logs_by_type'][log_type]['size_mb'] += file_size_mb

            # Get archive statistics
            if self.archive_dir.exists():
                archive_files = list(self.archive_dir.rglob('*'))
                for archive_file in archive_files:
                    if archive_file.is_file():
                        stats['archive_stats']['total_files'] += 1
                        stats['archive_stats']['total_size_mb'] += archive_file.stat().st_size / (1024 * 1024)

            # Round sizes
            stats['total_size_mb'] = round(stats['total_size_mb'], 2)
            stats['archive_stats']['total_size_mb'] = round(stats['archive_stats']['total_size_mb'], 2)

            for log_type in stats['logs_by_type']:
                stats['logs_by_type'][log_type]['size_mb'] = round(stats['logs_by_type'][log_type]['size_mb'], 2)

        except Exception as e:
            logger.error(f"Error getting log statistics: {e}")

        return stats

    def start_scheduled_cleanup(self):
        """Start the scheduled cleanup service."""
        if self._scheduler_thread and self._scheduler_thread.is_alive():
            logger.warning("Scheduled cleanup is already running")
            return

        # Schedule daily cleanup
        schedule.clear()
        cleanup_time = self.config.get('cleanup_schedule', '02:00')
        schedule.every().day.at(cleanup_time).do(self.cleanup_logs)

        logger.info(f"Scheduled daily log cleanup at {cleanup_time}")

        def run_scheduler():
            while not self._stop_scheduler:
                schedule.run_pending()
                time.sleep(60)  # Check every minute

        self._stop_scheduler = False
        self._scheduler_thread = threading.Thread(target=run_scheduler, daemon=True)
        self._scheduler_thread.start()

    def stop_scheduled_cleanup(self):
        """Stop the scheduled cleanup service."""
        self._stop_scheduler = True
        if self._scheduler_thread:
            self._scheduler_thread.join(timeout=5)
        schedule.clear()
        logger.info("Stopped scheduled log cleanup")

    def emergency_cleanup(self, target_size_mb: int = 500) -> Dict:
        """
        Perform emergency cleanup when disk space is low.

        Args:
            target_size_mb: Target total size for log files in MB

        Returns:
            Dict with cleanup statistics
        """
        logger.warning(f"Starting emergency log cleanup to reduce size to {target_size_mb}MB")

        stats = {'files_deleted': 0, 'space_freed_mb': 0}

        # Get all log files sorted by age (oldest first)
        log_files = self.get_log_files()
        log_files.sort(key=lambda x: x.stat().st_mtime)

        current_size_mb = sum(f.stat().st_size for f in log_files) / (1024 * 1024)

        for log_file in log_files:
            if current_size_mb <= target_size_mb:
                break

            file_size_mb = log_file.stat().st_size / (1024 * 1024)

            # Archive critical logs before deletion
            if any(pattern in log_file.name for pattern in ['security', 'payment_processing']):
                self.archive_log_file(log_file)

            log_file.unlink()
            stats['files_deleted'] += 1
            stats['space_freed_mb'] += file_size_mb
            current_size_mb -= file_size_mb

            logger.info(f"Emergency cleanup: deleted {log_file.name} ({file_size_mb:.1f}MB)")

        logger.warning(f"Emergency cleanup completed: {stats}")
        return stats
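Emergency cleanup deletes oldest files first and stops as soon as the total drops to the target. The selection logic on plain `(name, mtime, size_mb)` tuples (illustrative values only):

```python
# (name, mtime, size_mb) — made-up values for illustration
files = [
    ('b.log', 200, 250.0),
    ('a.log', 100, 300.0),
    ('c.log', 300, 150.0),
]
target_mb = 400.0

files.sort(key=lambda f: f[1])  # oldest first
current = sum(f[2] for f in files)  # 700.0 total
deleted = []
for name, _, size in files:
    if current <= target_mb:
        break  # stop as soon as we're under the target
    deleted.append(name)
    current -= size

# Deleting a.log (300 MB) brings the total to exactly 400, so b and c survive
print(deleted)  # ['a.log']
print(current)  # 400.0
```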
|
# Global retention manager instance
retention_manager = LogRetentionManager()


def initialize_log_retention():
    """Initialize the log retention system."""
    try:
        retention_manager.start_scheduled_cleanup()
        logger.info("Log retention system initialized successfully")
    except Exception as e:
        logger.error(f"Failed to initialize log retention system: {e}")


def get_retention_stats():
    """Get current log retention statistics."""
    return retention_manager.get_log_statistics()


def manual_cleanup():
    """Perform manual log cleanup."""
    return retention_manager.cleanup_logs()


if __name__ == "__main__":
    # Run log cleanup manually
    manager = LogRetentionManager()
    stats = manager.cleanup_logs()
    print(f"Log cleanup completed: {stats}")
@ -0,0 +1,318 @@

"""
Enhanced logging configuration for Plutus Payment Processing System.

This module provides structured logging with correlation IDs, performance monitoring,
security event tracking, and centralized log management.
"""

import json
import logging
import logging.handlers
import os
import threading
import time
import uuid
from contextlib import contextmanager
from datetime import datetime, timezone
from functools import wraps

# Thread-local storage for correlation IDs
_thread_local = threading.local()


class CorrelatedFormatter(logging.Formatter):
    """Custom formatter that adds correlation ID and structured data to logs."""

    def format(self, record):
        # Add correlation ID to log record
        if hasattr(_thread_local, 'correlation_id'):
            record.correlation_id = _thread_local.correlation_id
        else:
            record.correlation_id = 'no-correlation'

        # Add structured data if present
        if hasattr(record, 'structured_data'):
            record.structured_data_str = json.dumps(record.structured_data, default=str)
        else:
            record.structured_data_str = ''

        return super().format(record)
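Formatters can inject defaults for fields the format string expects but the record lacks, which is exactly what CorrelatedFormatter does with `correlation_id`. A minimal standalone version of that idea, captured into a string buffer so the output can be inspected:

```python
import io
import logging

class DefaultFieldFormatter(logging.Formatter):
    def format(self, record):
        # Supply a default so '%(correlation_id)s' never raises KeyError
        if not hasattr(record, 'correlation_id'):
            record.correlation_id = 'no-correlation'
        return super().format(record)

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(DefaultFieldFormatter('[%(correlation_id)s] %(levelname)s %(message)s'))

log = logging.getLogger('formatter-demo')
log.addHandler(handler)
log.setLevel(logging.INFO)
log.propagate = False
log.info('hello')

print(stream.getvalue().strip())  # [no-correlation] INFO hello
```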
|
class StructuredLogger:
    """Enhanced logger with structured logging capabilities."""

    def __init__(self, name: str):
        self.logger = logging.getLogger(name)
        self._setup_logger()

    def _setup_logger(self):
        """Configure the logger with enhanced formatting."""
        if not self.logger.handlers:  # Avoid duplicate handlers
            # Create logs directory if it doesn't exist
            os.makedirs('logs', exist_ok=True)

            # Console handler with structured formatting
            console_handler = logging.StreamHandler()
            console_formatter = CorrelatedFormatter(
                '%(asctime)s - [%(correlation_id)s] - %(name)s - %(levelname)s - %(message)s %(structured_data_str)s'
            )
            console_handler.setFormatter(console_formatter)

            # File handler with detailed formatting
            file_handler = logging.handlers.RotatingFileHandler(
                'logs/plutus_detailed.log',
                maxBytes=10*1024*1024,  # 10MB
                backupCount=5
            )
            file_formatter = CorrelatedFormatter(
                '%(asctime)s - [%(correlation_id)s] - %(name)s - %(levelname)s - %(funcName)s:%(lineno)d - %(message)s %(structured_data_str)s'
            )
            file_handler.setFormatter(file_formatter)

            self.logger.addHandler(console_handler)
            self.logger.addHandler(file_handler)
            self.logger.setLevel(logging.INFO)

    def info(self, message: str, **kwargs):
        """Log info message with optional structured data."""
        extra = {'structured_data': kwargs} if kwargs else {}
        self.logger.info(message, extra=extra)

    def error(self, message: str, **kwargs):
        """Log error message with optional structured data."""
        extra = {'structured_data': kwargs} if kwargs else {}
        self.logger.error(message, extra=extra)

    def warning(self, message: str, **kwargs):
        """Log warning message with optional structured data."""
        extra = {'structured_data': kwargs} if kwargs else {}
        self.logger.warning(message, extra=extra)

    def debug(self, message: str, **kwargs):
        """Log debug message with optional structured data."""
        extra = {'structured_data': kwargs} if kwargs else {}
        self.logger.debug(message, extra=extra)

    def critical(self, message: str, **kwargs):
        """Log critical message with optional structured data."""
        extra = {'structured_data': kwargs} if kwargs else {}
        self.logger.critical(message, extra=extra)


class SecurityLogger:
    """Specialized logger for security events."""

    def __init__(self):
        self.logger = StructuredLogger('security')

        # Ensure logs directory exists
        os.makedirs('logs', exist_ok=True)

        # Additional security log file
        security_handler = logging.handlers.RotatingFileHandler(
            'logs/security.log',
            maxBytes=5*1024*1024,  # 5MB
            backupCount=10
        )
        security_formatter = CorrelatedFormatter(
            '%(asctime)s - SECURITY - [%(correlation_id)s] - %(levelname)s - %(message)s %(structured_data_str)s'
        )
        security_handler.setFormatter(security_formatter)
        self.logger.logger.addHandler(security_handler)

    def log_login_attempt(self, username: str, success: bool, ip_address: str, user_agent: str = None):
        """Log login attempts."""
        event_type = "LOGIN_SUCCESS" if success else "LOGIN_FAILED"
        self.logger.info(
            f"{event_type} for user: {username}",
            event_type=event_type,
            username=username,
            ip_address=ip_address,
            user_agent=user_agent,
            timestamp=datetime.now(timezone.utc).isoformat()
        )

    def log_permission_denied(self, username: str, action: str, resource: str, ip_address: str):
        """Log permission denied events."""
        self.logger.warning(
            f"PERMISSION_DENIED: User {username} attempted {action} on {resource}",
            event_type="PERMISSION_DENIED",
            username=username,
            action=action,
            resource=resource,
            ip_address=ip_address,
            timestamp=datetime.now(timezone.utc).isoformat()
        )

    def log_payment_fraud_alert(self, payment_id: int, customer_id: str, reason: str, amount: float):
        """Log potential fraud alerts."""
        self.logger.critical(
            f"FRAUD_ALERT: Payment {payment_id} for customer {customer_id}",
            event_type="FRAUD_ALERT",
            payment_id=payment_id,
            customer_id=customer_id,
            reason=reason,
            amount=amount,
            timestamp=datetime.now(timezone.utc).isoformat()
        )


class PerformanceLogger:
    """Performance monitoring logger."""

    def __init__(self):
        self.logger = StructuredLogger('performance')

        # Ensure logs directory exists
        os.makedirs('logs', exist_ok=True)

        # Additional performance log file
        perf_handler = logging.handlers.RotatingFileHandler(
            'logs/performance.log',
            maxBytes=10*1024*1024,  # 10MB
            backupCount=5
        )
        perf_formatter = CorrelatedFormatter(
            '%(asctime)s - PERF - [%(correlation_id)s] - %(message)s %(structured_data_str)s'
        )
        perf_handler.setFormatter(perf_formatter)
        self.logger.logger.addHandler(perf_handler)

    def log_request_time(self, endpoint: str, method: str, duration_ms: float, status_code: int, user_id: int = None):
        """Log HTTP request performance."""
        self.logger.info(
            f"REQUEST: {method} {endpoint} - {duration_ms:.2f}ms - {status_code}",
            endpoint=endpoint,
            method=method,
            duration_ms=duration_ms,
            status_code=status_code,
            user_id=user_id,
            timestamp=datetime.now(timezone.utc).isoformat()
        )

    def log_database_query(self, query_type: str, table: str, duration_ms: float, row_count: int = None):
        """Log database query performance."""
        self.logger.info(
            f"DB_QUERY: {query_type} on {table} - {duration_ms:.2f}ms",
            query_type=query_type,
            table=table,
            duration_ms=duration_ms,
            row_count=row_count,
            timestamp=datetime.now(timezone.utc).isoformat()
        )

    def log_stripe_api_call(self, operation: str, duration_ms: float, success: bool, error_code: str = None):
        """Log Stripe API call performance."""
        status = "SUCCESS" if success else f"FAILED ({error_code})"
        self.logger.info(
            f"STRIPE_API: {operation} - {duration_ms:.2f}ms - {status}",
            operation=operation,
            duration_ms=duration_ms,
            success=success,
            error_code=error_code,
            timestamp=datetime.now(timezone.utc).isoformat()
        )


# Global logger instances
app_logger = StructuredLogger('plutus.app')
security_logger = SecurityLogger()
performance_logger = PerformanceLogger()


def get_logger(name: str) -> StructuredLogger:
    """Get a structured logger instance."""
    return StructuredLogger(name)


def set_correlation_id(correlation_id: str = None) -> str:
    """Set correlation ID for current thread."""
    if correlation_id is None:
        correlation_id = str(uuid.uuid4())[:8]
    _thread_local.correlation_id = correlation_id
    return correlation_id


def get_correlation_id() -> str:
    """Get current correlation ID."""
    return getattr(_thread_local, 'correlation_id', 'no-correlation')


@contextmanager
def log_context(correlation_id: str = None):
    """Context manager for setting correlation ID."""
    old_id = get_correlation_id()
    new_id = set_correlation_id(correlation_id)
    try:
        yield new_id
    finally:
        _thread_local.correlation_id = old_id
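log_context restores the previous correlation ID on exit, so nested contexts unwind cleanly even when exceptions propagate. The same save/yield/restore pattern in isolation, recording what `get_correlation_id` returns at each step:

```python
import threading
import uuid
from contextlib import contextmanager

_local = threading.local()

def get_correlation_id() -> str:
    return getattr(_local, 'correlation_id', 'no-correlation')

@contextmanager
def log_context(correlation_id=None):
    old_id = get_correlation_id()  # save whatever was active
    _local.correlation_id = correlation_id or str(uuid.uuid4())[:8]
    try:
        yield _local.correlation_id
    finally:
        _local.correlation_id = old_id  # restore on exit, even on error

seen = []
with log_context('outer-01'):
    with log_context('inner-02'):
        seen.append(get_correlation_id())
    seen.append(get_correlation_id())  # inner context unwound
seen.append(get_correlation_id())      # back to the default
print(seen)  # ['inner-02', 'outer-01', 'no-correlation']
```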
|
def log_performance(operation_name: str = None):
    """Decorator to log function performance."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            name = operation_name or f"{func.__module__}.{func.__name__}"
            start_time = time.time()

            try:
                result = func(*args, **kwargs)
                duration_ms = (time.time() - start_time) * 1000
                performance_logger.logger.info(
                    f"OPERATION: {name} completed in {duration_ms:.2f}ms",
                    operation=name,
                    duration_ms=duration_ms,
                    success=True
                )
                return result
            except Exception as e:
                duration_ms = (time.time() - start_time) * 1000
                performance_logger.logger.error(
                    f"OPERATION: {name} failed after {duration_ms:.2f}ms - {str(e)}",
                    operation=name,
                    duration_ms=duration_ms,
                    success=False,
                    error=str(e),
                    exception_type=type(e).__name__
                )
                raise
        return wrapper
    return decorator
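The decorator logs duration on both paths and re-raises on failure, so callers still see the original exception. A stripped-down version that records events into a list instead of a logger, to make the success/failure bookkeeping visible:

```python
import time
from functools import wraps

events = []

def log_performance(operation_name=None):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            name = operation_name or func.__name__
            start = time.time()
            try:
                result = func(*args, **kwargs)
                events.append((name, True))   # success path
                return result
            except Exception:
                events.append((name, False))  # failure path, then re-raise
                raise
        return wrapper
    return decorator

@log_performance('charge')
def charge(amount):
    if amount <= 0:
        raise ValueError('bad amount')
    return amount

charge(10.0)
try:
    charge(-1)
except ValueError:
    pass
print(events)  # [('charge', True), ('charge', False)]
```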
|

def setup_flask_logging(app):
    """Set up Flask application logging."""
    # Clear Flask's default handlers
    app.logger.handlers.clear()

    # Route Flask's log records through the structured logger
    flask_logger = get_logger('plutus.flask')

    class FlaskLogHandler(logging.Handler):
        def emit(self, record):
            flask_logger.logger.handle(record)

    handler = FlaskLogHandler()
    app.logger.addHandler(handler)
    app.logger.setLevel(logging.INFO)


# Log retention configuration
LOG_RETENTION_DAYS = 30
LOG_CLEANUP_SCHEDULE = "0 2 * * *"  # Daily at 2 AM


def cleanup_old_logs():
    """Clean up old log files based on the retention policy."""
    import glob
    import os
    from datetime import datetime, timedelta

    cutoff_date = datetime.now() - timedelta(days=LOG_RETENTION_DAYS)

    log_patterns = [
        'logs/*.log',
        'logs/*.log.*'
    ]

    for pattern in log_patterns:
        for log_file in glob.glob(pattern):
            try:
                # Use mtime: on Linux, ctime is the inode-change time, not creation time
                file_time = datetime.fromtimestamp(os.path.getmtime(log_file))
                if file_time < cutoff_date:
                    os.remove(log_file)
                    app_logger.info(f"Cleaned up old log file: {log_file}")
            except Exception as e:
                app_logger.error(f"Failed to clean up log file {log_file}: {e}")
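The cutoff test inside `cleanup_old_logs` reduces to a pure date comparison, which is worth checking in isolation (a sketch with an injected "now" so it is deterministic; names here are illustrative, not part of the module):

```python
from datetime import datetime, timedelta

LOG_RETENTION_DAYS = 30


def is_expired(file_time: datetime, now: datetime,
               retention_days: int = LOG_RETENTION_DAYS) -> bool:
    """Return True when a log file's timestamp is older than the retention window."""
    return file_time < now - timedelta(days=retention_days)


now = datetime(2025, 6, 15)
assert is_expired(datetime(2025, 5, 1), now)       # 45 days old: expired
assert not is_expired(datetime(2025, 6, 1), now)   # 14 days old: kept
```

A file whose timestamp lands exactly on the cutoff is kept, since the comparison is strict (`<`).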
@ -0,0 +1,361 @@
"""
Flask middleware for request/response logging and monitoring.

This module provides middleware to automatically log HTTP requests, responses,
performance metrics, and security events.
"""

import time
import json
from datetime import datetime, timezone
from flask import request, g, current_app
from flask_login import current_user
from logging_config import (
    performance_logger, security_logger, app_logger,
    set_correlation_id, get_correlation_id, log_context
)


class RequestLoggingMiddleware:
    """Middleware for comprehensive request/response logging."""

    def __init__(self, app=None):
        self.app = app
        if app is not None:
            self.init_app(app)

    def init_app(self, app):
        """Initialize the middleware with the Flask app."""
        app.before_request(self.before_request)
        app.after_request(self.after_request)
        app.teardown_appcontext(self.teardown_request)

    def before_request(self):
        """Called before each request."""
        # Skip logging for static files
        if request.endpoint == 'static' or request.path.startswith('/static/'):
            return

        # Set the correlation ID for request tracking, honouring an incoming
        # header if present (so the thread-local is always updated)
        correlation_id = set_correlation_id(request.headers.get('X-Correlation-ID'))
        g.correlation_id = correlation_id
        g.start_time = time.time()

        # Get client information
        g.client_ip = self.get_client_ip()
        g.user_agent = request.headers.get('User-Agent', 'Unknown')

        # Log request details
        self.log_request_start()

        # Security monitoring for sensitive endpoints
        self.check_security_events()

    def after_request(self, response):
        """Called after each request."""
        # Skip logging for static files
        if request.endpoint == 'static' or request.path.startswith('/static/'):
            return response

        if hasattr(g, 'start_time'):
            duration_ms = (time.time() - g.start_time) * 1000

            # Log response details
            self.log_request_complete(response, duration_ms)

            # Log performance metrics
            self.log_performance_metrics(response, duration_ms)

        # Add the correlation ID to the response headers
        response.headers['X-Correlation-ID'] = get_correlation_id()

        return response

    def teardown_request(self, exception=None):
        """Called when the request context is torn down."""
        try:
            # Skip logging for static files
            if request.endpoint == 'static' or request.path.startswith('/static/'):
                return

            if exception:
                app_logger.error(
                    f"Request failed with exception: {str(exception)}",
                    endpoint=request.endpoint,
                    method=request.method,
                    path=request.path,
                    exception_type=type(exception).__name__,
                    user_id=self.get_user_id()
                )
        except RuntimeError:
            # Request context is no longer available; skip logging
            pass
        except Exception:
            # Don't let logging errors break the teardown
            pass

    def get_client_ip(self):
        """Get the real client IP address."""
        # Check for forwarded headers (reverse proxy)
        if request.headers.get('X-Forwarded-For'):
            return request.headers.get('X-Forwarded-For').split(',')[0].strip()
        elif request.headers.get('X-Real-IP'):
            return request.headers.get('X-Real-IP')
        else:
            return request.environ.get('REMOTE_ADDR', 'Unknown')

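The header precedence in `get_client_ip` can be sketched as a pure function over a plain dict, which makes it easy to test without a request context (an illustration only; trust these headers solely when a reverse proxy you control sets them, since clients can forge them otherwise):

```python
def client_ip_from_headers(headers: dict, remote_addr: str = "Unknown") -> str:
    """Resolve the originating client IP, preferring reverse-proxy headers.

    X-Forwarded-For may hold a comma-separated chain (client, proxy1, proxy2);
    the left-most entry is the original client.
    """
    forwarded = headers.get("X-Forwarded-For")
    if forwarded:
        return forwarded.split(",")[0].strip()
    real_ip = headers.get("X-Real-IP")
    if real_ip:
        return real_ip
    return remote_addr
```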
    def get_user_id(self):
        """Get the current user ID if authenticated."""
        try:
            if hasattr(current_user, 'is_authenticated') and current_user.is_authenticated:
                return current_user.id
            return None
        except Exception:
            return None

    def log_request_start(self):
        """Log the start of a request."""
        app_logger.info(
            f"REQUEST_START: {request.method} {request.path}",
            method=request.method,
            path=request.path,
            endpoint=request.endpoint,
            client_ip=g.client_ip,
            user_agent=g.user_agent,
            user_id=self.get_user_id(),
            query_params=dict(request.args) if request.args else None
        )

    def log_request_complete(self, response, duration_ms):
        """Log the completion of a request."""
        app_logger.info(
            f"REQUEST_COMPLETE: {request.method} {request.path} - {response.status_code} - {duration_ms:.2f}ms",
            method=request.method,
            path=request.path,
            endpoint=request.endpoint,
            status_code=response.status_code,
            duration_ms=duration_ms,
            response_size=len(response.get_data()) if hasattr(response, 'get_data') else None,
            client_ip=g.client_ip,
            user_id=self.get_user_id()
        )

    def log_performance_metrics(self, response, duration_ms):
        """Log performance metrics for the request."""
        # Log slow requests (> 1 second)
        if duration_ms > 1000:
            performance_logger.logger.warning(
                f"SLOW_REQUEST: {request.method} {request.path} took {duration_ms:.2f}ms",
                method=request.method,
                path=request.path,
                endpoint=request.endpoint,
                duration_ms=duration_ms,
                status_code=response.status_code,
                user_id=self.get_user_id()
            )

        # Log to the performance logger
        performance_logger.log_request_time(
            endpoint=request.endpoint or request.path,
            method=request.method,
            duration_ms=duration_ms,
            status_code=response.status_code,
            user_id=self.get_user_id()
        )

    def check_security_events(self):
        """Check for potential security events."""
        query_string = request.query_string.decode('utf-8', errors='ignore').lower()

        # Check for SQL injection attempts
        sql_patterns = ['union', 'select', 'insert', 'delete', 'drop', 'truncate', '--', ';']
        if any(pattern in query_string for pattern in sql_patterns):
            security_logger.logger.warning(
                "POTENTIAL_SQL_INJECTION: Suspicious query string detected",
                path=request.path,
                query_string=request.query_string.decode('utf-8', errors='ignore'),
                client_ip=g.client_ip,
                user_agent=g.user_agent,
                user_id=self.get_user_id()
            )

        # Check for XSS attempts
        xss_patterns = ['<script', 'javascript:', 'onerror=', 'onload=', 'alert(']
        if any(pattern in query_string for pattern in xss_patterns):
            security_logger.logger.warning(
                "POTENTIAL_XSS: Suspicious script content detected",
                path=request.path,
                query_string=request.query_string.decode('utf-8', errors='ignore'),
                client_ip=g.client_ip,
                user_agent=g.user_agent,
                user_id=self.get_user_id()
            )

        # Monitor authentication attempts
        if request.endpoint in ['auth.login', 'auth.register'] and request.method == 'POST':
            g.is_auth_attempt = True

        # Check for brute-force attempts (multiple requests from the same IP)
        if hasattr(g, 'is_auth_attempt'):
            self.check_brute_force_attempt()

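The substring checks above can be factored into a pure classifier, which is easier to unit-test than middleware (a sketch; naive substring matching is noisy — a legitimate `?sort=select_date` parameter would trip it — so these results should feed log alerts, never request blocking):

```python
SQL_PATTERNS = ['union', 'select', 'insert', 'delete', 'drop', 'truncate', '--', ';']
XSS_PATTERNS = ['<script', 'javascript:', 'onerror=', 'onload=', 'alert(']


def classify_query_string(query_string: str) -> list:
    """Return the list of alert types a raw query string would trigger."""
    qs = query_string.lower()
    alerts = []
    if any(p in qs for p in SQL_PATTERNS):
        alerts.append('POTENTIAL_SQL_INJECTION')
    if any(p in qs for p in XSS_PATTERNS):
        alerts.append('POTENTIAL_XSS')
    return alerts
```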
    def check_brute_force_attempt(self):
        """Check for potential brute-force attacks."""
        # This would typically use Redis or the database to track attempts;
        # for now, just log the attempt.
        security_logger.logger.info(
            f"AUTH_ATTEMPT: Authentication attempt from {g.client_ip}",
            client_ip=g.client_ip,
            user_agent=g.user_agent,
            endpoint=request.endpoint,
            timestamp=datetime.now(timezone.utc).isoformat()
        )


class DatabaseLoggingMiddleware:
    """Middleware for database query logging."""

    def __init__(self, app=None):
        self.app = app
        if app is not None:
            self.init_app(app)

    def init_app(self, app):
        """Initialize database logging."""
        from sqlalchemy import event
        from sqlalchemy.engine import Engine

        @event.listens_for(Engine, "before_cursor_execute")
        def receive_before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
            context._query_start_time = time.time()

        @event.listens_for(Engine, "after_cursor_execute")
        def receive_after_cursor_execute(conn, cursor, statement, parameters, context, executemany):
            total = time.time() - context._query_start_time
            duration_ms = total * 1000

            # Extract the table name from the query
            table_name = self.extract_table_name(statement)
            query_type = statement.strip().split()[0].upper() if statement.strip() else 'UNKNOWN'

            # Log slow queries (> 100ms)
            if duration_ms > 100:
                performance_logger.logger.warning(
                    f"SLOW_QUERY: {query_type} on {table_name} took {duration_ms:.2f}ms",
                    query_type=query_type,
                    table=table_name,
                    duration_ms=duration_ms,
                    statement=statement[:200] + "..." if len(statement) > 200 else statement
                )

            # Log all database queries
            performance_logger.log_database_query(
                query_type=query_type,
                table=table_name,
                duration_ms=duration_ms
            )

    def extract_table_name(self, statement):
        """Extract the table name from a SQL statement."""
        try:
            statement_upper = statement.upper()
            if 'FROM ' in statement_upper:
                parts = statement_upper.split('FROM ')[1].split()
                return parts[0].strip('`"[]') if parts else 'unknown'
            elif 'INTO ' in statement_upper:
                parts = statement_upper.split('INTO ')[1].split()
                return parts[0].strip('`"[]') if parts else 'unknown'
            elif 'UPDATE ' in statement_upper:
                parts = statement_upper.split('UPDATE ')[1].split()
                return parts[0].strip('`"[]') if parts else 'unknown'
            else:
                return 'unknown'
        except Exception:
            return 'unknown'
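The split-based extraction works for simple statements; a regex variant handles quoted identifiers in one pass (a sketch under the same limitation — multi-table joins, subqueries, and CTEs still need a real SQL parser):

```python
import re

# Capture the identifier after FROM / INTO / UPDATE, skipping an optional
# opening quote character (backtick, double quote, or bracket).
_TABLE_RE = re.compile(r'\b(?:FROM|INTO|UPDATE)\s+[`"\[]?(\w+)', re.IGNORECASE)


def extract_table_name(statement: str) -> str:
    """Pull the first table name following FROM/INTO/UPDATE, or 'unknown'."""
    match = _TABLE_RE.search(statement or "")
    return match.group(1).lower() if match else 'unknown'
```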

class SecurityMiddleware:
    """Specialized middleware for security monitoring."""

    def __init__(self, app=None):
        self.app = app
        if app is not None:
            self.init_app(app)

    def init_app(self, app):
        """Initialize security middleware."""
        app.before_request(self.security_check)

    def security_check(self):
        """Perform security checks on each request."""
        # Skip security checks for static files
        if request.endpoint == 'static' or request.path.startswith('/static/'):
            return

        # Rate limiting check (basic implementation)
        self.check_rate_limiting()

        # Check for banned IPs or user agents
        self.check_blacklist()

        # Monitor for admin access
        self.monitor_admin_access()

    def check_rate_limiting(self):
        """Basic rate limiting check."""
        # This would typically use Redis for distributed rate limiting;
        # for now, just count requests within the current context.
        if hasattr(g, 'request_count'):
            g.request_count += 1
        else:
            g.request_count = 1

    def check_blacklist(self):
        """Check if the IP or user agent is blacklisted."""
        client_ip = request.environ.get('REMOTE_ADDR', 'Unknown')
        user_agent = request.headers.get('User-Agent', 'Unknown')

        # Example blacklist (would be configurable)
        blacklisted_ips = []
        suspicious_agents = ['sqlmap', 'nikto', 'nmap', 'masscan']

        if client_ip in blacklisted_ips:
            security_logger.logger.critical(
                "BLACKLISTED_IP: Access attempt from blacklisted IP",
                client_ip=client_ip,
                path=request.path,
                user_agent=user_agent
            )

        if any(agent in user_agent.lower() for agent in suspicious_agents):
            security_logger.logger.warning(
                "SUSPICIOUS_AGENT: Request with suspicious user agent",
                client_ip=client_ip,
                user_agent=user_agent,
                path=request.path
            )

    def monitor_admin_access(self):
        """Monitor access to admin endpoints."""
        admin_endpoints = ['/admin', '/management', '/config', '/settings']

        if any(endpoint in request.path for endpoint in admin_endpoints):
            try:
                user_id = current_user.id if hasattr(current_user, 'is_authenticated') and current_user.is_authenticated else None
            except Exception:
                user_id = None

            security_logger.logger.info(
                "ADMIN_ACCESS: Access to admin endpoint",
                path=request.path,
                client_ip=request.environ.get('REMOTE_ADDR', 'Unknown'),
                user_agent=request.headers.get('User-Agent', 'Unknown'),
                user_id=user_id
            )

@ -0,0 +1,307 @@
"""
Notification service for the Plutus payment processing application.

Handles email notifications for payment failures and success reports.
Integrates with the existing emailclass.py for SMTP functionality.
"""

from datetime import datetime
from typing import List, Dict, Any, Optional
from emailclass import SendEmail
import json
import logging

logger = logging.getLogger(__name__)


class NotificationService:
    def __init__(self):
        self.email_client = SendEmail()

    def send_payment_failure_notification(
        self,
        payment_data: Dict[str, Any],
        recipient_email: str = "alan.woodman@interphone.com.au"
    ) -> bool:
        """
        Send an email notification for a failed payment.

        Args:
            payment_data: Dictionary containing payment information
            recipient_email: Email address to send the notification to

        Returns:
            bool: True if the email was sent successfully
        """
        try:
            # Extract payment information
            splynx_id = payment_data.get('splynx_id', 'Unknown')
            payment_id = payment_data.get('payment_id', 'Unknown')
            amount = payment_data.get('amount', 0.0)
            error = payment_data.get('error', 'Unknown error')
            payment_method = payment_data.get('payment_method', 'Unknown')
            customer_name = payment_data.get('customer_name', 'Unknown Customer')

            # Configure email
            self.email_client.receiver = recipient_email
            self.email_client.subject = f"Payment Failure - Customer {splynx_id} - ${amount:.2f}"
            self.email_client.message_type = "html"

            # Create HTML email content
            html_content = self._create_failure_email_html(
                payment_data, splynx_id, payment_id, amount, error,
                payment_method, customer_name
            )
            self.email_client.message_body_html = html_content

            # Send email
            result = self.email_client.send()

            if result:
                logger.info(f"Payment failure email sent successfully for payment {payment_id}")
            else:
                logger.error(f"Failed to send payment failure email for payment {payment_id}")

            return result

        except Exception as e:
            logger.error(f"Error sending payment failure notification: {e}")
            return False

    def send_batch_summary_email(
        self,
        batch_summary: Dict[str, Any],
        recipient_email: str = "alan.woodman@interphone.com.au"
    ) -> bool:
        """
        Send an email summary for batch payment processing.

        Args:
            batch_summary: Dictionary containing the batch processing summary
            recipient_email: Email address to send the summary to

        Returns:
            bool: True if the email was sent successfully
        """
        try:
            batch_id = batch_summary.get('batch_id', 'Unknown')
            total_processed = batch_summary.get('total_processed', 0)
            successful_count = batch_summary.get('successful_count', 0)
            failed_count = batch_summary.get('failed_count', 0)
            total_amount = batch_summary.get('total_amount', 0.0)

            # Configure email
            self.email_client.receiver = recipient_email
            self.email_client.subject = f"Batch Payment Summary - Batch #{batch_id} - {successful_count}/{total_processed} Successful"
            self.email_client.message_type = "html"

            # Create HTML email content
            html_content = self._create_batch_summary_html(batch_summary)
            self.email_client.message_body_html = html_content

            # Send email
            result = self.email_client.send()

            if result:
                logger.info(f"Batch summary email sent successfully for batch {batch_id}")
            else:
                logger.error(f"Failed to send batch summary email for batch {batch_id}")

            return result

        except Exception as e:
            logger.error(f"Error sending batch summary email: {e}")
            return False

    def _create_failure_email_html(
        self,
        payment_data: Dict[str, Any],
        splynx_id: str,
        payment_id: str,
        amount: float,
        error: str,
        payment_method: str,
        customer_name: str
    ) -> str:
        """Create HTML content for the payment failure email."""

        timestamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

        return f"""
        <html>
        <head>
            <style>
                body {{ font-family: Arial, sans-serif; line-height: 1.6; color: #333; }}
                .header {{ background-color: #dc3545; color: white; padding: 20px; text-align: center; }}
                .content {{ padding: 20px; }}
                .info-table {{ width: 100%; border-collapse: collapse; margin: 20px 0; }}
                .info-table th, .info-table td {{ border: 1px solid #ddd; padding: 12px; text-align: left; }}
                .info-table th {{ background-color: #f8f9fa; font-weight: bold; }}
                .error-box {{ background-color: #f8d7da; border: 1px solid #f5c6cb; border-radius: 4px; padding: 15px; margin: 20px 0; }}
                .footer {{ background-color: #f8f9fa; padding: 15px; text-align: center; color: #666; }}
                .alert {{ padding: 10px; margin: 10px 0; border-radius: 4px; }}
                .alert-danger {{ background-color: #f8d7da; border-left: 4px solid #dc3545; }}
            </style>
        </head>
        <body>
            <div class="header">
                <h1>🚨 Payment Processing Failure</h1>
                <p>Plutus Payment System Alert</p>
            </div>

            <div class="content">
                <div class="alert alert-danger">
                    <strong>Payment Failed:</strong> A payment processing attempt has failed and requires attention.
                </div>

                <h2>Payment Details</h2>
                <table class="info-table">
                    <tr><th>Payment ID</th><td>{payment_id}</td></tr>
                    <tr><th>Splynx Customer ID</th><td><a href="https://billing.interphone.com.au/admin/customers/view?id={splynx_id}">{splynx_id}</a></td></tr>
                    <tr><th>Customer Name</th><td>{customer_name}</td></tr>
                    <tr><th>Payment Amount</th><td>${amount:.2f} AUD</td></tr>
                    <tr><th>Payment Method</th><td>{payment_method}</td></tr>
                    <tr><th>Timestamp</th><td>{timestamp}</td></tr>
                </table>

                <h2>Error Information</h2>
                <div class="error-box">
                    <h3>Error Details:</h3>
                    <p>{error}</p>
                </div>

                <h2>Recommended Actions</h2>
                <ul>
                    <li>Review the customer's payment method in Splynx</li>
                    <li>Contact the customer about the payment failure</li>
                    <li>Check if the customer needs to update their payment details</li>
                    <li>Consider creating a support ticket</li>
                </ul>

                <p><strong>System Links:</strong></p>
                <ul>
                    <li><a href="https://billing.interphone.com.au/admin/customers/view?id={splynx_id}">View Customer in Splynx</a></li>
                    <li><a href="http://your-plutus-url/single-payment/detail/{payment_id}">View Payment in Plutus</a></li>
                </ul>
            </div>

            <div class="footer">
                <p>This email was automatically generated by the Plutus Payment System</p>
                <p>Timestamp: {timestamp}</p>
            </div>
        </body>
        </html>
        """

    def _create_batch_summary_html(self, batch_summary: Dict[str, Any]) -> str:
        """Create HTML content for the batch summary email."""

        batch_id = batch_summary.get('batch_id', 'Unknown')
        total_processed = batch_summary.get('total_processed', 0)
        successful_count = batch_summary.get('successful_count', 0)
        failed_count = batch_summary.get('failed_count', 0)
        total_amount = batch_summary.get('total_amount', 0.0)
        success_amount = batch_summary.get('success_amount', 0.0)
        failed_payments = batch_summary.get('failed_payments', [])
        timestamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

        # Guard against an empty batch to avoid ZeroDivisionError
        success_rate = (successful_count / total_processed * 100) if total_processed else 0.0

        # Determine status color
        if failed_count == 0:
            status_color = "#28a745"  # Green
            status_text = "✅ All Successful"
        elif successful_count == 0:
            status_color = "#dc3545"  # Red
            status_text = "❌ All Failed"
        else:
            status_color = "#ffc107"  # Yellow
            status_text = "⚠️ Partial Success"

        # Build the failed payments table
        failed_payments_html = ""
        if failed_payments:
            failed_payments_html = """
            <h2>Failed Payments</h2>
            <table class="info-table">
                <tr><th>Payment ID</th><th>Customer ID</th><th>Amount</th><th>Error</th></tr>
            """
            for payment in failed_payments[:10]:  # Limit to the first 10 failures
                error_text = payment.get('error', 'Unknown error')
                if len(error_text) > 100:
                    error_text = error_text[:100] + "..."
                failed_payments_html += f"""
                <tr>
                    <td>{payment.get('id', 'N/A')}</td>
                    <td><a href="https://billing.interphone.com.au/admin/customers/view?id={payment.get('splynx_id', 'N/A')}">{payment.get('splynx_id', 'N/A')}</a></td>
                    <td>${payment.get('amount', 0.0):.2f}</td>
                    <td>{error_text}</td>
                </tr>
                """
            failed_payments_html += "</table>"

            if len(failed_payments) > 10:
                failed_payments_html += f"<p><em>... and {len(failed_payments) - 10} more failed payments</em></p>"

        return f"""
        <html>
        <head>
            <style>
                body {{ font-family: Arial, sans-serif; line-height: 1.6; color: #333; }}
                .header {{ background-color: {status_color}; color: white; padding: 20px; text-align: center; }}
                .content {{ padding: 20px; }}
                .info-table {{ width: 100%; border-collapse: collapse; margin: 20px 0; }}
                .info-table th, .info-table td {{ border: 1px solid #ddd; padding: 12px; text-align: left; }}
                .info-table th {{ background-color: #f8f9fa; font-weight: bold; }}
                .footer {{ background-color: #f8f9fa; padding: 15px; text-align: center; color: #666; }}
                .stats {{ display: flex; justify-content: space-around; margin: 20px 0; }}
                .stat-box {{ text-align: center; padding: 15px; border-radius: 8px; }}
                .stat-success {{ background-color: #d4edda; }}
                .stat-danger {{ background-color: #f8d7da; }}
                .stat-info {{ background-color: #d1ecf1; }}
            </style>
        </head>
        <body>
            <div class="header">
                <h1>📊 Batch Payment Summary</h1>
                <h2>Batch #{batch_id}</h2>
                <h3>{status_text}</h3>
            </div>

            <div class="content">
                <h2>Processing Summary</h2>
                <div class="stats">
                    <div class="stat-box stat-info">
                        <h3>{total_processed}</h3>
                        <p>Total Processed</p>
                    </div>
                    <div class="stat-box stat-success">
                        <h3>{successful_count}</h3>
                        <p>Successful</p>
                    </div>
                    <div class="stat-box stat-danger">
                        <h3>{failed_count}</h3>
                        <p>Failed</p>
                    </div>
                </div>

                <table class="info-table">
                    <tr><th>Metric</th><th>Value</th></tr>
                    <tr><td>Batch ID</td><td>#{batch_id}</td></tr>
                    <tr><td>Total Amount Processed</td><td>${total_amount:.2f} AUD</td></tr>
                    <tr><td>Successful Amount</td><td>${success_amount:.2f} AUD</td></tr>
                    <tr><td>Success Rate</td><td>{success_rate:.1f}%</td></tr>
                    <tr><td>Processing Time</td><td>{timestamp}</td></tr>
                </table>

                {failed_payments_html}

                <h2>Actions Required</h2>
                <ul>
                    {"<li>✅ No action required - all payments successful</li>" if failed_count == 0 else f"<li>⚠️ Review {failed_count} failed payment(s)</li>"}
                    {"<li>📧 Consider contacting customers with failed payments</li>" if failed_count > 0 else ""}
                    <li>📊 <a href='http://your-plutus-url/batch/{batch_id}'>View detailed batch report</a></li>
                </ul>
            </div>

            <div class="footer">
                <p>This email was automatically generated by the Plutus Payment System</p>
                <p>Batch processed at: {timestamp}</p>
            </div>
        </body>
        </html>
        """
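The batch summary's success rate is a simple ratio, but an empty batch (`total_processed == 0`) would raise `ZeroDivisionError` mid-way through building the email. Guarding the zero case keeps the report generation total (a sketch; the helper name is illustrative):

```python
def success_rate(successful_count: int, total_processed: int) -> float:
    """Percentage of successful payments, safe for empty batches."""
    if total_processed == 0:
        return 0.0
    return successful_count / total_processed * 100
```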
@ -0,0 +1,105 @@
"""
Permission system for the Plutus payment processing application.

Provides role-based access control with three permission levels:
- Admin: Full access to all features
- Finance: All features except user management
- Helpdesk: View-only access plus single payment processing
"""

from functools import wraps
from flask import abort, flash, redirect, url_for, request
from flask_login import current_user


# Permission levels (hierarchical)
PERMISSION_LEVELS = {
    'Admin': 3,
    'Finance': 2,
    'Helpdesk': 1
}


def get_user_permission_level(user):
    """Get the numeric permission level for a user."""
    if not user or not user.is_authenticated:
        return 0

    # Guard against a missing or None Permissions attribute
    user_permission = (getattr(user, 'Permissions', '') or '').strip()
    return PERMISSION_LEVELS.get(user_permission, 0)


def has_permission(required_permission):
    """Check if the current user has the required permission level."""
    if not current_user or not current_user.is_authenticated:
        return False

    user_level = get_user_permission_level(current_user)
    required_level = PERMISSION_LEVELS.get(required_permission, 999)

    return user_level >= required_level

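The hierarchical comparison at the heart of `has_permission` can be sketched as a pure function over role strings (an illustration decoupled from Flask-Login). Note the asymmetric defaults: an unknown user role maps to 0 so it never passes, while an unknown *required* permission maps to 999 so a typo in a route decorator fails closed rather than granting access:

```python
PERMISSION_LEVELS = {'Admin': 3, 'Finance': 2, 'Helpdesk': 1}


def role_has_permission(user_role: str, required: str) -> bool:
    """True when user_role meets or exceeds the required level."""
    user_level = PERMISSION_LEVELS.get(user_role, 0)        # unknown role: no access
    required_level = PERMISSION_LEVELS.get(required, 999)   # unknown requirement: fail closed
    return user_level >= required_level
```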

def require_permission(required_permission):
    """
    Decorator to require a specific permission level for route access.

    Args:
        required_permission (str): Permission level required ('Admin', 'Finance', 'Helpdesk')
    """
    def decorator(f):
        @wraps(f)
        def decorated_function(*args, **kwargs):
            if not current_user.is_authenticated:
                flash('Please log in to access this page.', 'error')
                return redirect(url_for('auth.login', next=request.url))

            if not has_permission(required_permission):
                flash('You do not have permission to access this page.', 'error')
                return redirect(url_for('main.index'))

            return f(*args, **kwargs)
        return decorated_function
    return decorator


def admin_required(f):
    """Decorator requiring Admin permission."""
    return require_permission('Admin')(f)


def finance_required(f):
    """Decorator requiring Finance permission or higher."""
    return require_permission('Finance')(f)


def helpdesk_required(f):
    """Decorator requiring Helpdesk permission or higher."""
    return require_permission('Helpdesk')(f)


# Template helper functions
def can_manage_users():
    """Check if the current user can manage users (Admin only)."""
    return has_permission('Admin')


def can_manage_payments():
    """Check if the current user can manage payments (Finance or Admin)."""
    return has_permission('Finance')


def can_view_data():
    """Check if the current user can view data (any authenticated user)."""
    return has_permission('Helpdesk')


def can_process_single_payments():
    """Check if the current user can process single payments (Helpdesk or higher)."""
    return has_permission('Helpdesk')


def can_manage_batch_payments():
    """Check if the current user can manage batch payments (Finance or Admin)."""
    return has_permission('Finance')


def can_manage_payment_plans():
    """Check if the current user can manage payment plans (Finance or Admin)."""
    return has_permission('Finance')


def can_view_logs():
    """Check if the current user can view system logs (Finance or Admin)."""
    return has_permission('Finance')


def can_export_data():
    """Check if the current user can export data (Finance or Admin)."""
    return has_permission('Finance')
@ -0,0 +1,838 @@ |
|||
{% extends "base.html" %} |
|||
|
|||
{% block title %}Analytics Dashboard - Plutus Payment System{% endblock %} |
|||
|
|||
{% block head %} |
|||
<style> |
|||
.metric-card { |
|||
background: white; |
|||
border-radius: 8px; |
|||
padding: 20px; |
|||
box-shadow: 0 2px 4px rgba(0,0,0,0.1); |
|||
margin-bottom: 20px; |
|||
text-align: center; |
|||
} |
|||
|
|||
.metric-value { |
|||
font-size: 2rem; |
|||
font-weight: bold; |
|||
margin-bottom: 5px; |
|||
} |
|||
|
|||
.metric-label { |
|||
color: #666; |
|||
font-size: 0.9rem; |
|||
text-transform: uppercase; |
|||
letter-spacing: 0.5px; |
|||
} |
|||
|
|||
.health-score { |
|||
display: flex; |
|||
align-items: center; |
|||
justify-content: center; |
|||
width: 100px; |
|||
height: 100px; |
|||
border-radius: 50%; |
|||
margin: 0 auto 15px; |
|||
font-size: 1.5rem; |
|||
font-weight: bold; |
|||
color: white; |
|||
} |
|||
|
|||
.health-excellent { background: linear-gradient(135deg, #48c774, #00d1b2); } |
|||
.health-good { background: linear-gradient(135deg, #48c774, #ffdd57); } |
|||
.health-warning { background: linear-gradient(135deg, #ffdd57, #ff9800); } |
|||
.health-critical { background: linear-gradient(135deg, #ff9800, #f14668); } |
|||
|
|||
.tab-content { |
|||
margin-top: 20px; |
|||
min-height: 400px; |
|||
} |
|||
|
|||
.log-entry { |
|||
background: #f8f9fa; |
|||
border-left: 4px solid #3273dc; |
|||
padding: 12px; |
|||
margin-bottom: 10px; |
|||
border-radius: 4px; |
|||
} |
|||
|
|||
.loading { |
|||
opacity: 0.6; |
|||
pointer-events: none; |
|||
} |
|||
|
|||
/* Tab styling improvements */ |
|||
.tabs ul { |
|||
border-bottom: 2px solid #dbdbdb; |
|||
} |
|||
|
|||
.tabs li { |
|||
background-color: #f5f5f5; |
|||
border: 1px solid #dbdbdb; |
|||
border-bottom: none; |
|||
margin-right: 2px; |
|||
} |
|||
|
|||
.tabs li:hover { |
|||
background-color: #e8e8e8; |
|||
} |
|||
|
|||
.tabs li.is-active { |
|||
background-color: #ffffff; |
|||
border-color: #3273dc; |
|||
border-bottom: 2px solid #ffffff; |
|||
margin-bottom: -2px; |
|||
position: relative; |
|||
z-index: 1; |
|||
} |
|||
|
|||
.tabs li a { |
|||
color: #4a4a4a; |
|||
font-weight: 500; |
|||
padding: 0.75em 1em; |
|||
border: none; |
|||
background: transparent; |
|||
} |
|||
|
|||
.tabs li.is-active a { |
|||
color: #3273dc; |
|||
font-weight: 600; |
|||
} |
|||
|
|||
.tabs li:hover a { |
|||
color: #363636; |
|||
} |
|||
</style> |
|||
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
{% endblock %}

{% block content %}
<section class="section">
    <div class="container">
        <div class="level">
            <div class="level-left">
                <div class="level-item">
                    <h1 class="title">
                        <span class="icon">
                            <i class="fas fa-chart-line"></i>
                        </span>
                        Analytics Dashboard
                    </h1>
                </div>
            </div>
            <div class="level-right">
                <div class="level-item">
                    <div class="field is-grouped">
                        <p class="control">
                            <button class="button is-light" onclick="refreshDashboard()">
                                <span class="icon">
                                    <i class="fas fa-sync-alt"></i>
                                </span>
                                <span>Refresh</span>
                            </button>
                        </p>
                        <p class="control">
                            <button class="button is-primary" onclick="exportReport()">
                                <span class="icon">
                                    <i class="fas fa-download"></i>
                                </span>
                                <span>Export</span>
                            </button>
                        </p>
                    </div>
                </div>
            </div>
        </div>

        <!-- System Health Overview -->
        <div class="columns">
            <div class="column is-3">
                <div class="metric-card">
                    <div id="healthScore" class="health-score">
                        <span id="healthValue">--</span>
                    </div>
                    <div class="metric-label">System Health Score</div>
                    <p class="is-size-7 has-text-grey">Overall system performance</p>
                </div>
            </div>
            <div class="column is-3">
                <div class="metric-card">
                    <div id="paymentSuccessRate" class="metric-value has-text-success">--%</div>
                    <div class="metric-label">Payment Success Rate</div>
                    <p class="is-size-7 has-text-grey">Last 24 hours</p>
                </div>
            </div>
            <div class="column is-3">
                <div class="metric-card">
                    <div id="errorRate" class="metric-value has-text-warning">--%</div>
                    <div class="metric-label">Error Rate</div>
                    <p class="is-size-7 has-text-grey">System errors in logs</p>
                </div>
            </div>
            <div class="column is-3">
                <div class="metric-card">
                    <div id="totalPayments" class="metric-value has-text-info">--</div>
                    <div class="metric-label">Total Payments</div>
                    <p class="is-size-7 has-text-grey">Recent activity</p>
                </div>
            </div>
        </div>

        <!-- Tabs -->
        <div class="tabs is-centered">
            <ul>
                <li class="is-active" data-tab="performance">
                    <a>
                        <span class="icon is-small"><i class="fas fa-tachometer-alt"></i></span>
                        <span>Performance</span>
                    </a>
                </li>
                <li data-tab="payments">
                    <a>
                        <span class="icon is-small"><i class="fas fa-credit-card"></i></span>
                        <span>Payments</span>
                    </a>
                </li>
                <li data-tab="security">
                    <a>
                        <span class="icon is-small"><i class="fas fa-shield-alt"></i></span>
                        <span>Security</span>
                    </a>
                </li>
                <li data-tab="logs">
                    <a>
                        <span class="icon is-small"><i class="fas fa-list-alt"></i></span>
                        <span>Logs</span>
                    </a>
                </li>
            </ul>
        </div>

        <!-- Tab Content -->
        <div class="tab-content">
            <!-- Performance Tab -->
            <div id="performance-content" class="tab-pane is-active">
                <div class="box">
                    <h4 class="title is-4">
                        <span class="icon">
                            <i class="fas fa-clock"></i>
                        </span>
                        System Performance
                    </h4>
                    <div id="performanceMetrics">
                        <div class="has-text-centered">
                            <span class="icon is-large">
                                <i class="fas fa-spinner fa-pulse"></i>
                            </span>
                            <p>Loading performance metrics...</p>
                        </div>
                    </div>
                </div>
            </div>

            <!-- Payments Tab -->
            <div id="payments-content" class="tab-pane" style="display: none;">
                <div class="box">
                    <h4 class="title is-4">
                        <span class="icon">
                            <i class="fas fa-chart-line"></i>
                        </span>
                        Payment Analytics
                    </h4>
                    <div id="paymentMetrics">
                        <div class="has-text-centered">
                            <span class="icon is-large">
                                <i class="fas fa-spinner fa-pulse"></i>
                            </span>
                            <p>Loading payment analytics...</p>
                        </div>
                    </div>
                </div>
            </div>

            <!-- Security Tab -->
            <div id="security-content" class="tab-pane" style="display: none;">
                <div class="columns">
                    <div class="column is-4">
                        <div class="metric-card">
                            <div id="securityEvents" class="metric-value has-text-success">--</div>
                            <div class="metric-label">Security Events</div>
                            <p class="is-size-7 has-text-grey">Last 7 days</p>
                        </div>
                    </div>
                    <div class="column is-4">
                        <div class="metric-card">
                            <div id="failedLogins" class="metric-value has-text-warning">--</div>
                            <div class="metric-label">Failed Logins</div>
                            <p class="is-size-7 has-text-grey">Authentication failures</p>
                        </div>
                    </div>
                    <div class="column is-4">
                        <div class="metric-card">
                            <div id="blockedRequests" class="metric-value has-text-danger">--</div>
                            <div class="metric-label">Blocked Requests</div>
                            <p class="is-size-7 has-text-grey">Suspicious activity</p>
                        </div>
                    </div>
                </div>
                <div class="box">
                    <h4 class="title is-4">
                        <span class="icon">
                            <i class="fas fa-shield-alt"></i>
                        </span>
                        Security Events
                    </h4>
                    <div id="securityEventsList">
                        <div class="has-text-centered">
                            <span class="icon is-large">
                                <i class="fas fa-spinner fa-pulse"></i>
                            </span>
                            <p>Loading security events...</p>
                        </div>
                    </div>
                </div>
            </div>

            <!-- Logs Tab -->
            <div id="logs-content" class="tab-pane" style="display: none;">
                <div class="box">
                    <div class="level">
                        <div class="level-left">
                            <div class="level-item">
                                <h4 class="title is-4">
                                    <span class="icon">
                                        <i class="fas fa-search"></i>
                                    </span>
                                    Log Search
                                </h4>
                            </div>
                        </div>
                        <div class="level-right">
                            <div class="level-item">
                                <div class="field has-addons">
                                    <div class="control">
                                        <input type="text" id="logSearch" class="input" placeholder="Search logs...">
                                    </div>
                                    <div class="control">
                                        <div class="select">
                                            <select id="logAction">
                                                <option value="">All Actions</option>
                                                <option value="LOGIN_SUCCESS">Login Success</option>
                                                <option value="LOGIN_FAILED">Login Failed</option>
                                                <option value="PAYMENT_PROCESSED">Payment Processed</option>
                                                <option value="BATCH_CREATED">Batch Created</option>
                                            </select>
                                        </div>
                                    </div>
                                    <div class="control">
                                        <button class="button is-primary" onclick="searchLogs()">
                                            <span class="icon">
                                                <i class="fas fa-search"></i>
                                            </span>
                                        </button>
                                    </div>
                                </div>
                            </div>
                        </div>
                    </div>
                    <div id="logResults">
                        <div class="has-text-centered">
                            <span class="icon is-large">
                                <i class="fas fa-spinner fa-pulse"></i>
                            </span>
                            <p>Loading recent logs...</p>
                        </div>
                    </div>
                </div>
            </div>
        </div>
    </div>
</section>

<script>
$(document).ready(function() {
    // Initialize dashboard
    refreshSystemHealth();

    // Load the default active tab content (Performance)
    loadTabContent('performance');

    // Tab switching
    $('.tabs li').click(function() {
        var tab = $(this).data('tab');

        // Update tab appearance
        $('.tabs li').removeClass('is-active');
        $(this).addClass('is-active');

        // Show/hide content
        $('.tab-pane').hide().removeClass('is-active');
        $('#' + tab + '-content').show().addClass('is-active');

        // Load tab content
        loadTabContent(tab);
    });

    // Auto-refresh every 30 seconds
    setInterval(refreshSystemHealth, 30000);
});

function refreshSystemHealth() {
    fetch('/analytics/api/system-health')
        .then(response => response.json())
        .then(data => {
            if (data.error) {
                console.error('Error loading system health:', data.error);
                if (data.debug_error) {
                    console.error('Debug error:', data.debug_error);
                }
                // Keep the placeholder values rather than rendering undefined metrics
                return;
            }

            updateHealthScore(data.health_score);
            $('#paymentSuccessRate').text(data.payment_success_rate + '%');
            $('#errorRate').text(data.error_rate + '%');
            $('#totalPayments').text((data.metrics.total_payments || 0).toLocaleString());

            // Update health score color
            const healthElement = $('#healthScore');
            healthElement.removeClass('health-excellent health-good health-warning health-critical');

            if (data.health_score >= 90) {
                healthElement.addClass('health-excellent');
            } else if (data.health_score >= 75) {
                healthElement.addClass('health-good');
            } else if (data.health_score >= 60) {
                healthElement.addClass('health-warning');
            } else {
                healthElement.addClass('health-critical');
            }
        })
        .catch(error => {
            console.error('Error loading system health:', error);
            // Show mock data
            updateHealthScore(85);
            $('#paymentSuccessRate').text('95%');
            $('#errorRate').text('2.5%');
            $('#totalPayments').text('45');
        });
}

function updateHealthScore(score) {
    $('#healthValue').text(Math.round(score || 0));
}

function loadTabContent(tab) {
    switch (tab) {
        case 'performance':
            loadPerformanceMetrics();
            break;
        case 'payments':
            loadPaymentAnalytics();
            break;
        case 'security':
            loadSecurityEvents();
            break;
        case 'logs':
            searchLogs();
            break;
    }
}

function loadPerformanceMetrics() {
    console.log('Loading performance metrics...');
    $('#performanceMetrics').html(`
        <div class="has-text-centered">
            <span class="icon is-large">
                <i class="fas fa-spinner fa-pulse"></i>
            </span>
            <p>Loading performance metrics...</p>
        </div>
    `);

    // Add timeout to the fetch
    const controller = new AbortController();
    const timeoutId = setTimeout(() => controller.abort(), 10000); // 10 second timeout

    fetch('/analytics/api/performance-metrics', {
        signal: controller.signal,
        headers: {
            'Accept': 'application/json',
            'Content-Type': 'application/json'
        }
    })
    .then(response => {
        clearTimeout(timeoutId);
        console.log('Performance metrics response:', response.status);
        if (!response.ok) {
            throw new Error(`HTTP ${response.status}: ${response.statusText}`);
        }
        return response.json();
    })
    .then(data => {
        console.log('Performance metrics data:', data);
        if (data.error) {
            $('#performanceMetrics').html(`
                <div class="notification is-info">
                    <h5 class="title is-5">
                        <span class="icon"><i class="fas fa-chart-line"></i></span>
                        Performance Monitoring
                    </h5>
                    <p><strong>Status:</strong> ${data.message || data.error}</p>
                    <p>The system is actively collecting performance data. Check back after some activity.</p>
                    ${data.debug_error ? `<details><summary>Debug Info</summary><pre>${data.debug_error}</pre></details>` : ''}
                </div>
            `);
            return;
        }

        let html = '<div class="content">';

        // System overview
        if (data.system_info) {
            html += '<div class="columns">';
            html += '<div class="column is-4">';
            html += '<div class="box has-text-centered">';
            html += '<p class="heading">Monitoring Status</p>';
            html += `<p class="title ${data.system_info.monitoring_active ? 'has-text-success' : 'has-text-warning'}">`;
            html += `<span class="icon"><i class="fas ${data.system_info.monitoring_active ? 'fa-check-circle' : 'fa-exclamation-triangle'}"></i></span>`;
            html += `${data.system_info.monitoring_active ? 'Active' : 'Inactive'}`;
            html += '</p>';
            html += '</div></div>';

            html += '<div class="column is-4">';
            html += '<div class="box has-text-centered">';
            html += '<p class="heading">Log Files</p>';
            html += `<p class="title has-text-info">${data.system_info.log_files_found || 0}</p>`;
            html += '</div></div>';

            html += '<div class="column is-4">';
            html += '<div class="box has-text-centered">';
            html += '<p class="heading">Collection Period</p>';
            html += `<p class="title has-text-grey">${data.system_info.data_collection_period || '7 days'}</p>`;
            html += '</div></div>';
            html += '</div>';
        }

        // Performance summary
        if (data.summary) {
            html += '<h5 class="title is-5">Performance Summary</h5>';
            html += '<div class="columns">';
            html += '<div class="column is-3">';
            html += '<div class="box has-text-centered">';
            html += '<p class="heading">Slow Requests</p>';
            html += `<p class="title ${data.summary.slow_request_count > 0 ? 'has-text-warning' : 'has-text-success'}">${data.summary.slow_request_count || 0}</p>`;
            html += '<p class="help">Requests > 1 second</p>';
            html += '</div></div>';

            html += '<div class="column is-3">';
            html += '<div class="box has-text-centered">';
            html += '<p class="heading">Slow Queries</p>';
            html += `<p class="title ${data.summary.database_queries > 0 ? 'has-text-warning' : 'has-text-success'}">${data.summary.database_queries || 0}</p>`;
            html += '<p class="help">DB queries > 100ms</p>';
            html += '</div></div>';

            html += '<div class="column is-3">';
            html += '<div class="box has-text-centered">';
            html += '<p class="heading">Total Requests</p>';
            html += `<p class="title has-text-info">${data.summary.total_requests || 'N/A'}</p>`;
            html += '<p class="help">Recent requests</p>';
            html += '</div></div>';

            html += '<div class="column is-3">';
            html += '<div class="box has-text-centered">';
            html += '<p class="heading">Avg Response</p>';
            html += `<p class="title has-text-info">${data.summary.avg_response_time || 'N/A'}</p>`;
            html += '<p class="help">Milliseconds</p>';
            html += '</div></div>';
            html += '</div>';
        }

        // Slow requests table
        if (data.slow_requests && data.slow_requests.length > 0) {
            html += '<h5 class="title is-5">Recent Slow Requests</h5>';
            html += '<table class="table is-striped is-fullwidth">';
            html += '<thead><tr><th>Time</th><th>Endpoint</th><th>Duration</th><th>Status</th></tr></thead>';
            html += '<tbody>';
            data.slow_requests.slice(0, 10).forEach(req => {
                html += '<tr>';
                html += `<td>${new Date(req.timestamp).toLocaleString()}</td>`;
                html += `<td><code>${req.endpoint}</code></td>`;
                html += `<td><span class="tag is-warning">${Math.round(req.duration_ms)}ms</span></td>`;
                html += `<td><span class="tag ${req.status_code < 400 ? 'is-success' : 'is-danger'}">${req.status_code}</span></td>`;
                html += '</tr>';
            });
            html += '</tbody></table>';
        } else {
            html += '<div class="notification is-success">';
            html += '<span class="icon"><i class="fas fa-check"></i></span>';
            html += '<strong>Good Performance!</strong> No slow requests detected recently.';
            html += '</div>';
        }

        // Slow queries table
        if (data.slow_queries && data.slow_queries.length > 0) {
            html += '<h5 class="title is-5">Recent Slow Database Queries</h5>';
            html += '<table class="table is-striped is-fullwidth">';
            html += '<thead><tr><th>Time</th><th>Table</th><th>Type</th><th>Duration</th></tr></thead>';
            html += '<tbody>';
            data.slow_queries.slice(0, 10).forEach(query => {
                html += '<tr>';
                html += `<td>${new Date(query.timestamp).toLocaleString()}</td>`;
                html += `<td><code>${query.table}</code></td>`;
                html += `<td>${query.query_type}</td>`;
                html += `<td><span class="tag is-warning">${Math.round(query.duration_ms)}ms</span></td>`;
                html += '</tr>';
            });
            html += '</tbody></table>';
        }

        html += '</div>';
        $('#performanceMetrics').html(html);
    })
    .catch(error => {
        clearTimeout(timeoutId);
        console.error('Error loading performance metrics:', error);

        let errorMessage = 'Unable to load performance metrics at this time.';
        if (error.name === 'AbortError') {
            errorMessage = 'Request timed out after 10 seconds.';
        } else if (error.message.includes('HTTP')) {
            errorMessage = `Server error: ${error.message}`;
        }

        $('#performanceMetrics').html(`
            <div class="notification is-warning">
                <h5 class="title is-5">Performance Data Loading Error</h5>
                <p><strong>Error:</strong> ${errorMessage}</p>
                <p>The system may be initializing or there may be a connectivity issue.</p>
                <button class="button is-small is-primary" onclick="loadPerformanceMetrics()">Try Again</button>
                <details style="margin-top: 10px;">
                    <summary>Debug Information</summary>
                    <pre>${error.stack || error.message}</pre>
                </details>
            </div>
        `);
    });
}

function loadPaymentAnalytics() {
    $('#paymentMetrics').html(`
        <div class="has-text-centered">
            <span class="icon is-large">
                <i class="fas fa-spinner fa-pulse"></i>
            </span>
            <p>Loading payment analytics...</p>
        </div>
    `);

    fetch('/analytics/api/payment-analytics')
        .then(response => response.json())
        .then(data => {
            if (data.error) {
                $('#paymentMetrics').html(`
                    <div class="notification is-info">
                        <p><strong>Payment analytics:</strong> No recent payment data available.</p>
                        <p>Analytics will appear after payment processing activity.</p>
                    </div>
                `);
                return;
            }

            let html = '<div class="content">';
            html += '<h5>Payment Analytics Overview</h5>';
            html += '<p>Payment analytics are being collected. Detailed metrics will appear with payment activity.</p>';
            html += '</div>';

            $('#paymentMetrics').html(html);
        })
        .catch(error => {
            console.error('Error loading payment analytics:', error);
            $('#paymentMetrics').html(`
                <div class="notification is-info">
                    <p>Payment analytics will be available after payment processing activity.</p>
                </div>
            `);
        });
}

function loadSecurityEvents() {
    $('#securityEventsList').html(`
        <div class="has-text-centered">
            <span class="icon is-large">
                <i class="fas fa-spinner fa-pulse"></i>
            </span>
            <p>Loading security events...</p>
        </div>
    `);

    fetch('/analytics/api/security-events')
        .then(response => response.json())
        .then(data => {
            if (data.error) {
                $('#securityEventsList').html(`
                    <div class="notification is-success">
                        <p><strong>Security status:</strong> All clear - no security events detected.</p>
                        <p>The system is actively monitoring for security threats.</p>
                    </div>
                `);
                $('#securityEvents').text('0');
                $('#failedLogins').text('0');
                $('#blockedRequests').text('0');
                return;
            }

            $('#securityEvents').text(data.summary.total_events);
            $('#failedLogins').text(data.summary.failed_logins);
            $('#blockedRequests').text(data.summary.blocked_requests);

            if (data.summary.total_events === 0) {
                $('#securityEventsList').html(`
                    <div class="notification is-success">
                        <span class="icon">
                            <i class="fas fa-shield-alt"></i>
                        </span>
                        <strong>All clear!</strong> No security events detected.
                    </div>
                `);
            } else {
                $('#securityEventsList').html('<p>Security events detected. Review logs for details.</p>');
            }
        })
        .catch(error => {
            console.error('Error loading security events:', error);
            $('#securityEventsList').html(`
                <div class="notification is-success">
                    <span class="icon">
                        <i class="fas fa-shield-alt"></i>
                    </span>
                    Security monitoring is active.
                </div>
            `);
            $('#securityEvents').text('0');
            $('#failedLogins').text('0');
            $('#blockedRequests').text('0');
        });
}

function searchLogs() {
    const query = $('#logSearch').val();
    const action = $('#logAction').val();

    $('#logResults').html(`
        <div class="has-text-centered">
            <span class="icon is-large">
                <i class="fas fa-spinner fa-pulse"></i>
            </span>
            <p>Searching logs...</p>
        </div>
    `);

    const params = new URLSearchParams({ page: 1, per_page: 20 });
    if (query) params.append('q', query);
    if (action) params.append('action', action);

    fetch(`/analytics/api/logs/search?${params}`)
        .then(response => response.json())
        .then(data => {
            if (data.error) {
                $('#logResults').html(`
                    <div class="notification is-warning">
                        <p><strong>Log search error:</strong> ${data.error}</p>
                    </div>
                `);
                return;
            }

            if (!data.logs || data.logs.length === 0) {
                $('#logResults').html(`
                    <div class="notification is-info">
                        <p>No logs found matching your search criteria.</p>
                    </div>
                `);
                return;
            }

            let html = '';
            data.logs.forEach(log => {
                const logClass = log.action.includes('ERROR') || log.action.includes('FAILED') ? 'has-background-danger-light' :
                                 log.action.includes('WARNING') ? 'has-background-warning-light' :
                                 log.action.includes('SUCCESS') ? 'has-background-success-light' : '';

                html += `<div class="log-entry ${logClass}">
                    <div class="level">
                        <div class="level-left">
                            <div class="level-item">
                                <strong>${log.action}</strong>
                            </div>
                        </div>
                        <div class="level-right">
                            <div class="level-item">
                                <small class="has-text-grey">${new Date(log.timestamp).toLocaleString()}</small>
                            </div>
                        </div>
                    </div>
                    <div>${log.message || 'No message'}</div>
                    <small class="has-text-grey">
                        ${log.entity_type} ${log.entity_id || ''} | User ID: ${log.user_id || 'System'} | IP: ${log.ip_address || 'Unknown'}
                    </small>
                </div>`;
            });

            $('#logResults').html(html);
        })
        .catch(error => {
            console.error('Error searching logs:', error);
            $('#logResults').html(`
                <div class="notification is-warning">
                    <p>Error searching logs. Please try again.</p>
                </div>
            `);
        });
}

function refreshDashboard() {
    document.body.classList.add('loading');

    refreshSystemHealth();

    // Refresh current tab content
    const activeTab = $('.tabs li.is-active').data('tab');
    if (activeTab) {
        loadTabContent(activeTab);
    }

    setTimeout(() => {
        document.body.classList.remove('loading');
    }, 1000);
}

function exportReport() {
    const activeTab = $('.tabs li.is-active').data('tab') || 'system';

    fetch(`/analytics/api/generate-report?type=${activeTab}&days=7`)
        .then(response => response.json())
        .then(data => {
            if (data.error) {
                alert('Error generating report: ' + data.error);
                return;
            }

            // Create and download JSON report
            const blob = new Blob([JSON.stringify(data, null, 2)], { type: 'application/json' });
            const url = URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;
            a.download = `plutus_${activeTab}_report_${new Date().toISOString().split('T')[0]}.json`;
            document.body.appendChild(a);
            a.click();
            document.body.removeChild(a);
            URL.revokeObjectURL(url);
        })
        .catch(error => {
            console.error('Error generating report:', error);
            alert('Error generating report');
        });
}
</script>
{% endblock %}
@ -0,0 +1,428 @@
{% extends "base.html" %}

{% block title %}Payment Search - Plutus{% endblock %}

{% block head %}
<style>
.search-container {
    max-width: 1200px;
    margin: 0 auto;
}

.search-box {
    background: white;
    border-radius: 8px;
    padding: 2rem;
    box-shadow: 0 2px 4px rgba(0,0,0,0.1);
    margin-bottom: 2rem;
}

.search-input {
    font-size: 1.2rem;
    padding: 1rem;
    border: 2px solid #dbdbdb;
    border-radius: 6px;
    transition: border-color 0.3s;
}

.search-input:focus {
    border-color: #3273dc;
    outline: none;
    box-shadow: 0 0 0 3px rgba(50, 115, 220, 0.25);
}

.search-results {
    background: white;
    border-radius: 8px;
    box-shadow: 0 2px 4px rgba(0,0,0,0.1);
}

.result-item {
    padding: 1rem;
    border-bottom: 1px solid #f5f5f5;
    transition: background-color 0.3s;
}

.result-item:hover {
    background-color: #fafafa;
}

.result-item:last-child {
    border-bottom: none;
}

.result-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    margin-bottom: 0.5rem;
}

.result-type {
    display: inline-block;
    padding: 0.25rem 0.5rem;
    border-radius: 4px;
    font-size: 0.8rem;
    font-weight: bold;
    text-transform: uppercase;
}

.result-type.single {
    background-color: #48c774;
    color: white;
}

.result-type.batch {
    background-color: #3273dc;
    color: white;
}

.result-details {
    display: grid;
    grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
    gap: 1rem;
    margin: 0.5rem 0;
}

.result-field {
    font-size: 0.9rem;
}

.result-field strong {
    color: #363636;
}

.result-actions {
    margin-top: 1rem;
    display: flex;
    gap: 0.5rem;
    flex-wrap: wrap;
}

.loading {
    text-align: center;
    padding: 2rem;
}

.no-results {
    text-align: center;
    padding: 3rem;
    color: #666;
}

.search-tips {
    background-color: #f5f5f5;
    border-radius: 4px;
    padding: 1rem;
    margin-top: 1rem;
}

.search-tips ul {
    margin: 0;
    padding-left: 1.5rem;
}

.search-error {
    background-color: #f8d7da;
    border: 1px solid #f5c6cb;
    border-radius: 4px;
    padding: 1rem;
    margin: 1rem 0;
    color: #721c24;
}
</style>
{% endblock %}

{% block content %}
<div class="search-container">
    <div class="search-box">
        <h1 class="title is-2">🔍 Payment Search</h1>
        <p class="subtitle">Search across all payment records by Splynx ID or Payment Intent</p>

        <div class="field has-addons">
            <div class="control is-expanded">
                <input
                    class="input search-input"
                    type="text"
                    id="searchQuery"
                    placeholder="Enter Splynx ID (e.g. 123456) or Payment Intent (e.g. pi_1234567890)"
                    autocomplete="off"
                >
            </div>
            <div class="control">
                <button class="button is-primary is-large" onclick="performSearch()">
                    <span class="icon">
                        <i class="fas fa-search"></i>
                    </span>
                    <span>Search</span>
                </button>
            </div>
        </div>

        <div class="field is-grouped is-grouped-multiline" style="margin-top: 1rem;">
            <div class="control">
                <label class="label is-small">Search Type:</label>
                <div class="select is-small">
                    <select id="searchType">
                        <option value="all">Auto-detect</option>
                        <option value="splynx_id">Splynx ID</option>
                        <option value="payment_intent">Payment Intent</option>
                    </select>
                </div>
            </div>
            <div class="control">
                <label class="label is-small">Results Limit:</label>
                <div class="select is-small">
                    <select id="resultsLimit">
                        <option value="25">25 results</option>
                        <option value="50" selected>50 results</option>
                        <option value="100">100 results</option>
                    </select>
                </div>
            </div>
            <div class="control">
                <button class="button is-small is-info is-outlined" onclick="clearSearch()">
                    <span class="icon"><i class="fas fa-times"></i></span>
                    <span>Clear</span>
                </button>
            </div>
        </div>

        <div class="search-tips">
            <h5 class="title is-6">💡 Search Tips:</h5>
            <ul>
                <li><strong>Splynx ID:</strong> Enter customer ID number (e.g., 123456)</li>
                <li><strong>Payment Intent:</strong> Enter full Stripe Payment Intent ID (e.g., pi_1234567890)</li>
                <li><strong>Auto-detect:</strong> System automatically detects search type based on format</li>
                <li><strong>Results:</strong> Searches both Single Payments and Batch Payments simultaneously</li>
            </ul>
        </div>
    </div>

    <div id="searchResults" style="display: none;"></div>
</div>

<script> |
|||
let searchInProgress = false; |
|||
|
|||
// Perform search on Enter key |
|||
document.getElementById('searchQuery').addEventListener('keypress', function(e) { |
|||
if (e.key === 'Enter') { |
|||
performSearch(); |
|||
} |
|||
}); |
|||
|
|||
// Auto-focus search input |
|||
document.getElementById('searchQuery').focus(); |
|||
|
|||
function performSearch() { |
|||
const query = document.getElementById('searchQuery').value.trim(); |
|||
const searchType = document.getElementById('searchType').value; |
|||
const limit = document.getElementById('resultsLimit').value; |
|||
|
|||
if (!query) { |
|||
showError('Please enter a search query'); |
|||
return; |
|||
} |
|||
|
|||
if (searchInProgress) { |
|||
return; // Prevent multiple simultaneous searches |
|||
} |
|||
|
|||
searchInProgress = true; |
|||
showLoading(); |
|||
|
|||
const params = new URLSearchParams({ |
|||
q: query, |
|||
type: searchType, |
|||
limit: limit |
|||
}); |
|||
|
|||
fetch(`/search/api?${params}`) |
|||
.then(response => response.json()) |
|||
.then(data => { |
|||
searchInProgress = false; |
|||
if (data.success) { |
|||
displayResults(data); |
|||
} else { |
|||
showError(data.error || 'Search failed'); |
|||
} |
|||
}) |
|||
.catch(error => { |
|||
searchInProgress = false; |
|||
console.error('Search error:', error); |
|||
showError('Network error occurred while searching'); |
|||
}); |
|||
} |
|||
|
|||
function showLoading() { |
|||
const resultsDiv = document.getElementById('searchResults'); |
|||
resultsDiv.style.display = 'block'; |
|||
resultsDiv.innerHTML = ` |
|||
<div class="search-results"> |
|||
<div class="loading"> |
|||
<div class="loader"></div> |
|||
<p>Searching payments...</p> |
|||
</div> |
|||
</div> |
|||
`; |
|||
} |
|||
|
|||
function showError(message) { |
|||
const resultsDiv = document.getElementById('searchResults'); |
|||
resultsDiv.style.display = 'block'; |
|||
resultsDiv.innerHTML = ` |
|||
<div class="search-error"> |
|||
<strong>Search Error:</strong> ${message} |
|||
</div> |
|||
`; |
|||
} |
|||
|
// Render the full result set, or a "no results" message
function displayResults(data) {
    const resultsDiv = document.getElementById('searchResults');

    if (data.results.length === 0) {
        resultsDiv.innerHTML = `
            <div class="search-results">
                <div class="no-results">
                    <h3 class="title is-4">No Results Found</h3>
                    <p>No payments found matching "${data.search_query}"</p>
                    <p class="has-text-grey">Try searching with a different Splynx ID or Payment Intent</p>
                </div>
            </div>
        `;
        return;
    }

    let resultsHtml = `
        <div class="search-results">
            <div style="padding: 1rem; border-bottom: 1px solid #f5f5f5; background-color: #f9f9f9;">
                <h3 class="title is-4">Search Results</h3>
                <p>Found ${data.total_found} payment(s) for "${data.search_query}" (${data.search_type})</p>
            </div>
    `;

    data.results.forEach(result => {
        resultsHtml += createResultItem(result);
    });

    resultsHtml += '</div>';
    resultsDiv.innerHTML = resultsHtml;
}

// Build the HTML card for a single search result.
// result.success is tri-state: true = success, false = failed, null/undefined = pending.
function createResultItem(result) {
    const statusClass = result.success === true ? 'has-background-success-light' :
                        result.success === false ? 'has-background-danger-light' :
                        'has-background-warning-light';

    const statusText = result.success === true ? 'Success' :
                       result.success === false ? 'Failed' :
                       'Pending';

    const statusIcon = result.success === true ? 'fa-check' :
                       result.success === false ? 'fa-times' :
                       'fa-clock';

    return `
        <div class="result-item ${statusClass}">
            <div class="result-header">
                <div>
                    <span class="result-type ${result.type}">${result.type}</span>
                    <strong style="margin-left: 0.5rem;">Payment #${result.id}</strong>
                    ${result.batch_id ? `<span class="tag is-info is-light">Batch #${result.batch_id}</span>` : ''}
                </div>
                <div class="tags">
                    <span class="tag ${result.success === true ? 'is-success' : result.success === false ? 'is-danger' : 'is-warning'}">
                        <i class="fas ${statusIcon}"></i> ${statusText}
                    </span>
                    ${result.refund ? '<span class="tag is-purple">Refunded</span>' : ''}
                    ${result.pi_followup ? '<span class="tag is-warning">PI Follow-up</span>' : ''}
                    ${result.refund_followup ? '<span class="tag is-info">Refund Follow-up</span>' : ''}
                </div>
            </div>

            <div class="result-details">
                <div class="result-field">
                    <strong>Splynx ID:</strong>
                    ${result.splynx_id ? `<a href="${result.splynx_url}" target="_blank">${result.splynx_id}</a>` : 'N/A'}
                </div>
                <div class="result-field">
                    <strong>Amount:</strong> $${Math.abs(result.amount).toFixed(2)} AUD
                </div>
                <div class="result-field">
                    <strong>Payment Method:</strong> ${result.payment_method || 'N/A'}
                </div>
                <div class="result-field">
                    <strong>Payment Intent:</strong>
                    <code style="font-size: 0.8rem;">${result.payment_intent || 'N/A'}</code>
                </div>
                <div class="result-field">
                    <strong>Created:</strong> ${new Date(result.created).toLocaleDateString()} ${new Date(result.created).toLocaleTimeString()}
                </div>
                <div class="result-field">
                    <strong>Processed By:</strong> ${result.processed_by}
                </div>
            </div>

            ${result.error ? `
                <div class="notification is-danger is-light" style="margin-top: 0.5rem;">
                    <strong>Error:</strong> ${result.error.substring(0, 200)}${result.error.length > 200 ? '...' : ''}
                </div>
            ` : ''}

            <div class="result-actions">
                <a href="${result.detail_url}" class="button is-small is-primary">
                    <span class="icon"><i class="fas fa-eye"></i></span>
                    <span>View Details</span>
                </a>
                ${result.batch_url ? `
                    <a href="${result.batch_url}" class="button is-small is-info">
                        <span class="icon"><i class="fas fa-layer-group"></i></span>
                        <span>View Batch</span>
                    </a>
                ` : ''}
                ${result.splynx_url ? `
                    <a href="${result.splynx_url}" target="_blank" class="button is-small is-link">
                        <span class="icon"><i class="fas fa-external-link-alt"></i></span>
                        <span>Splynx Customer</span>
                    </a>
                ` : ''}
            </div>
        </div>
    `;
}

// Reset the search form to its defaults and hide any results
function clearSearch() {
    document.getElementById('searchQuery').value = '';
    document.getElementById('searchType').value = 'all';
    document.getElementById('resultsLimit').value = '50';
    document.getElementById('searchResults').style.display = 'none';
    document.getElementById('searchQuery').focus();
}

// Inject CSS for the loading spinner and custom tag colour
const style = document.createElement('style');
style.textContent = `
    .loader {
        border: 3px solid #f3f3f3;
        border-top: 3px solid #3498db;
        border-radius: 50%;
        width: 30px;
        height: 30px;
        animation: spin 1s linear infinite;
        margin: 0 auto 1rem;
    }

    @keyframes spin {
        0% { transform: rotate(0deg); }
        100% { transform: rotate(360deg); }
    }

    .tag.is-purple {
        background-color: #9370db;
        color: white;
    }
`;
document.head.appendChild(style);
</script> |
|||
{% endblock %} |
|||