
Terminal-based Developer Tools

By Michael Houghton
Rust · Python · CLI

Project Overview

This collection of terminal-based developer tools shows how to build efficient command-line interfaces that improve developer productivity. Each tool was built with a focus on performance, ease of use, and solving real developer pain points.

The tools demonstrate different approaches to CLI development across two languages: Rust for performance-critical tools and Python for rapid prototyping and extensibility.

Code Examples

1. File Watcher (Rust)

A fast file system watcher that triggers custom actions when files change.

use notify::{Config, RecommendedWatcher, RecursiveMode, Watcher};
use std::path::Path;
use std::sync::mpsc::{channel, Receiver};
use std::time::Duration;

/// A simple file watcher that executes a callback when files change
pub struct FileWatcher {
    watcher: RecommendedWatcher,
    rx: Receiver<Result<notify::Event, notify::Error>>,
    path: String,
}

impl FileWatcher {
    pub fn new(path: &str) -> Result<Self, notify::Error> {
        // Create a channel to receive events
        let (tx, rx) = channel();
        
        // Create a watcher with default config
        let config = Config::default()
            .with_poll_interval(Duration::from_secs(1));
        let watcher = RecommendedWatcher::new(tx, config)?;
        
        Ok(Self {
            watcher,
            rx,
            path: path.to_string(),
        })
    }
    
    pub fn watch(&mut self, callback: fn(&str)) -> Result<(), notify::Error> {
        // Start watching the specified path
        self.watcher.watch(Path::new(&self.path), RecursiveMode::Recursive)?;
        
        // Set up the event handling loop
        println!("Watching for changes in {}", self.path);
        
        // Process events
        loop {
            match self.rx.recv() {
                Ok(event) => {
                    match event {
                        Ok(event) => {
                            println!("Change detected: {:?}", event);
                            callback(&self.path);
                        }
                        Err(error) => println!("Error: {:?}", error),
                    }
                }
                Err(error) => {
                    println!("Watch error: {:?}", error);
                    break;
                }
            }
        }
        
        Ok(())
    }
}

fn main() {
    let args: Vec<String> = std::env::args().collect();
    let path = args.get(1).map(String::as_str).unwrap_or(".");

    let mut watcher = FileWatcher::new(path).expect("Failed to create watcher");
    
    // Define a callback to execute when files change
    let callback = |path: &str| {
        println!("Running tests for {}", path);
        let output = std::process::Command::new("cargo")
            .arg("test")
            .output();
        
        match output {
            Ok(output) => {
                println!("Test results: {}", String::from_utf8_lossy(&output.stdout));
            }
            Err(e) => {
                println!("Failed to run tests: {}", e);
            }
        }
    };
    
    // Start watching
    watcher.watch(callback).expect("Failed to watch");
}

2. Log Parser (Python)

A tool for extracting and analyzing patterns from log files.

import re
import argparse
from collections import Counter
from datetime import datetime
import sys
from typing import List, Dict, Any

class LogParser:
    def __init__(self, log_format=None):
        # Default log format for common log formats
        self.log_format = log_format or {
            "timestamp": r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})",
            "level": r"(INFO|WARNING|ERROR|DEBUG|CRITICAL)",
            "message": r"(.*)"
        }
        self.patterns = {
            "ip_address": r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b",
            "email": r"\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,}\b",
            "url": r"https?://(?:[-\w.]|(?:%[\da-fA-F]{2}))+",
            "error_code": r"Error code: (\d+)"
        }
    
    def parse_line(self, line: str) -> Dict[str, Any]:
        # Create a regex pattern from the log format
        pattern = f"{self.log_format['timestamp']}\\s+{self.log_format['level']}\\s+{self.log_format['message']}"
        match = re.match(pattern, line)
        
        if not match:
            return {"raw": line}
        
        result = {
            "timestamp": match.group(1),
            "level": match.group(2),
            "message": match.group(3).strip()
        }
        
        # Extract additional patterns from the message
        message = result["message"]
        for name, pattern in self.patterns.items():
            matches = re.findall(pattern, message)
            if matches:
                result[name] = matches
        
        return result
    
    def parse_file(self, file_path: str) -> List[Dict[str, Any]]:
        results = []
        
        try:
            with open(file_path, 'r') as f:
                for line in f:
                    if line.strip():
                        parsed = self.parse_line(line.strip())
                        results.append(parsed)
        except Exception as e:
            print(f"Error parsing file {file_path}: {e}", file=sys.stderr)
        
        return results
    
    def analyze(self, logs: List[Dict[str, Any]]) -> Dict[str, Any]:
        # Count occurrences of log levels
        levels = Counter(log["level"] for log in logs if "level" in log)
        
        # Count unique IPs
        ip_addresses = []
        for log in logs:
            if "ip_address" in log:
                ip_addresses.extend(log["ip_address"])
        
        # Analyze error patterns
        error_logs = [log for log in logs if log.get("level") == "ERROR"]
        error_codes = []
        for log in error_logs:
            if "error_code" in log:
                error_codes.extend(log["error_code"])
        
        return {
            "total_logs": len(logs),
            "level_counts": dict(levels),
            "unique_ips": len(set(ip_addresses)),
            "error_count": len(error_logs),
            "common_error_codes": dict(Counter(error_codes).most_common(5))
        }

def main():
    parser = argparse.ArgumentParser(description="Parse and analyze log files")
    parser.add_argument("file", help="Log file to analyze")
    parser.add_argument("-o", "--output", help="Output file (default: stdout)")
    args = parser.parse_args()
    
    log_parser = LogParser()
    logs = log_parser.parse_file(args.file)
    analysis = log_parser.analyze(logs)
    
    output = [
        f"Log Analysis Report - {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}",
        f"File: {args.file}",
        f"Total log entries: {analysis['total_logs']}",
        "\nLevel distribution:"
    ]
    
    for level, count in analysis["level_counts"].items():
        output.append(f"  {level}: {count}")
    
    output.extend([
        f"\nUnique IP addresses: {analysis['unique_ips']}",
        f"Total errors: {analysis['error_count']}",
        "\nTop error codes:"
    ])
    
    for code, count in analysis["common_error_codes"].items():
        output.append(f"  Error {code}: {count} occurrences")
    
    result = "\n".join(output)
    
    if args.output:
        with open(args.output, 'w') as f:
            f.write(result)
    else:
        print(result)

if __name__ == "__main__":
    main()

Key Learning Points

  • Language-specific strengths: Rust's performance and memory safety vs. Python's rapid development and extensibility
  • CLI design patterns: Building intuitive command-line interfaces with proper argument parsing and help documentation (see the argparse sketch after this list)
  • Error handling approaches: Result-based error handling in Rust versus exception-based handling in Python
  • Cross-platform considerations: Ensuring tools work consistently across different operating systems
  • Performance optimization: Techniques for making tools efficient even when processing large amounts of data
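
To make the CLI design point concrete, here is a minimal argparse sketch with subcommands and per-command help. It is only a sketch: the devtools program name and the watch/parse subcommands are illustrative and are not part of the tools above.

import argparse

def build_cli() -> argparse.ArgumentParser:
    # Top-level parser supplies --help for the whole tool
    parser = argparse.ArgumentParser(prog="devtools", description="Terminal developer tools")
    subparsers = parser.add_subparsers(dest="command", required=True)

    # Each subcommand declares its own arguments and help text
    watch = subparsers.add_parser("watch", help="Watch a directory and run a command on change")
    watch.add_argument("path", nargs="?", default=".", help="Directory to watch")
    watch.add_argument("--run", default="cargo test", help="Command to run when a change is detected")

    parse = subparsers.add_parser("parse", help="Parse and analyze a log file")
    parse.add_argument("file", help="Log file to analyze")
    parse.add_argument("-o", "--output", help="Write the report to a file instead of stdout")

    return parser

if __name__ == "__main__":
    args = build_cli().parse_args()
    print(args)  # a real tool would dispatch on args.command here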

Implementation Notes

These tools demonstrate several important programming concepts:

  1. Rust File Watcher: Uses the notify crate to monitor file system events, showcasing Rust's ownership model and error handling with Result types.
  2. Python Log Parser: Demonstrates regular expressions, file handling, and data analysis with a focus on readability and extensibility (a short usage sketch follows this list).
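
The extensibility in item 2 is easiest to see in use. The snippet below is a usage sketch that assumes the LogParser class defined above is in scope; the request_id pattern and the app.log file name are illustrative, not part of the tool.

from collections import Counter

parser = LogParser()  # assumes the LogParser class defined above is in scope

# Any named regex added to the pattern table is extracted from every message.
parser.patterns["request_id"] = r"request_id=([0-9a-f-]+)"  # illustrative pattern

logs = parser.parse_file("app.log")  # illustrative file name
request_ids = [rid for entry in logs for rid in entry.get("request_id", [])]
print(Counter(request_ids).most_common(3))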

The code examples are intentionally structured to illustrate best practices in each language, with an emphasis on:

  • Proper error handling and reporting
  • Clean code organization and documentation
  • Reusable components and clear abstractions
  • Efficient resource usage (see the streaming sketch below)
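
On the last point, parse_file collects every parsed entry into a list, which is fine for typical logs but not for very large ones. Below is a minimal generator-based sketch of a streaming alternative, assuming the LogParser class above; iter_entries is a hypothetical helper, not part of the tool.

from typing import Any, Dict, Iterator

def iter_entries(parser: "LogParser", file_path: str) -> Iterator[Dict[str, Any]]:
    """Yield parsed entries one at a time so large logs never sit fully in memory."""
    with open(file_path, "r") as f:
        for line in f:
            line = line.strip()
            if line:
                yield parser.parse_line(line)

# Example: count ERROR entries in a very large log without building a list.
# error_count = sum(
#     1 for entry in iter_entries(LogParser(), "huge.log")
#     if entry.get("level") == "ERROR"
# )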