Core Linux text processing commands and shell patterns for searching, filtering, editing, formatting, and analyzing text files and streams.
Common text processing building blocks and pipeline patterns.
cat app.log | grep ERROR
Use a pipeline to send command output into another command for filtering or transformation.
grep ERROR app.log
This is usually more efficient than `cat file | grep ...`, since it avoids an extra process and lets grep read the file directly.
journalctl -u nginx | awk '{print $1, $2, $3}'
Most Linux text tools are designed for stream processing and work especially well in pipelines.
Write output to a file while still passing it downstream.
grep ERROR app.log | tee errors.txt | wc -l
`tee` is useful when debugging pipelines or capturing intermediate results.
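As a quick runnable sketch (the log contents and file names here are hypothetical), `tee` saves the matching lines to disk while still passing them downstream to `wc`:

```shell
# Hypothetical demo data; any log file works the same way.
printf 'ERROR disk full\nINFO started\nERROR timeout\n' > app.log

# Save matching lines to errors.txt while counting them downstream.
grep ERROR app.log | tee errors.txt | wc -l
```

After this runs, `errors.txt` contains the two ERROR lines and `wc -l` reports their count.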
ps aux \
  | grep python \
  | grep -v grep \
  | awk '{print $2, $11}'
Long shell pipelines become much easier to read and maintain when formatted over multiple lines; a trailing backslash (or ending the line with `|`) continues the pipeline on the next line.
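The `awk` field-selection step can be illustrated in isolation; this sketch uses `printf` with made-up two-column data instead of live `ps` output so the result is predictable:

```shell
# Fake "PID COMMAND"-style rows (hypothetical values); awk prints
# the second whitespace-separated field of each line.
printf '101 /usr/bin/python3\n202 /usr/bin/sshd\n' \
  | awk '{print $2}'
```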
Pick lines, ranges, and context quickly.
head -n 20 server.log
Useful for verifying file structure before applying heavier text processing.
tail -n 50 server.log
Often used with logs to inspect the most recent events.
tail -f /var/log/nginx/access.log
Use `-f` to stream appended data in real time.
sed -n '100,120p' file.txt
`sed -n` suppresses default output so only the requested range is printed.
awk 'NR>=100 && NR<=120' file.txt
Awk's `NR` variable tracks the current line number.
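Both range techniques can be checked against generated input; here `seq` produces numbered lines so the selected window (21 lines, both endpoints inclusive) is easy to verify:

```shell
# Print lines 100-120 of a 200-line stream, two equivalent ways.
seq 200 | sed -n '100,120p'
seq 200 | awk 'NR>=100 && NR<=120'
```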
Normalize spacing, wrap text, and clean lines.
tr -s ' ' < input.txt
Good for cleaning space-delimited text before further parsing.
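A quick check of the squeeze behavior on inline data (the sample string is made up):

```shell
# Collapse each run of spaces into a single space.
printf 'a   b     c\n' | tr -s ' '
# -> a b c
```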
Remove spaces and tabs from the start of each line.
sed 's/^[[:space:]]*//' file.txt
Character classes make the expression portable across whitespace types.
Remove spaces and tabs from the end of each line.
sed 's/[[:space:]]*$//' file.txt
Useful before committing scripts, YAML, or source files.
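Both substitutions can be verified on a small inline sample (the text is hypothetical); the bracketed class matches spaces and tabs alike:

```shell
# Strip leading, then trailing, whitespace from each line.
printf '   indented line   \n' \
  | sed 's/^[[:space:]]*//' \
  | sed 's/[[:space:]]*$//'
# -> indented line
```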
fmt -w 72 README.txt
`fmt` is useful for quickly wrapping plain text to a target width.
fold -w 80 long-lines.txt
Unlike `fmt`, `fold` wraps mechanically rather than reflowing paragraphs.
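The mechanical nature of `fold` is easiest to see on a single long token, which `fmt` would not split because it only breaks at whitespace:

```shell
# fold cuts every 4 characters regardless of word boundaries.
printf 'abcdefghij\n' | fold -w 4
# -> abcd
#    efgh
#    ij
```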