C# – How to use FileSystemWatcher

You can use the FileSystemWatcher class to detect file system changes, such as when a file is created, deleted, modified, or renamed. When a change happens, it raises an event that you can handle. This is an event-based alternative to polling for file changes.

In this article, I’ll show how to use FileSystemWatcher to detect and process new files and modified files. FileSystemWatcher has some problematic behavior, which I’ll explain and show how to solve as part of the steps.

1 – Create and configure the FileSystemWatcher

Here’s an example of creating and configuring the FileSystemWatcher to watch for .txt file changes in a directory:

using System.IO;

var watcher = new FileSystemWatcher(@"C:\temp\", filter: "*.txt");

watcher.Changed += (_, e) => Console.WriteLine($"File changed: {e.FullPath}");
watcher.Created += (_, e) => Console.WriteLine($"File created: {e.FullPath}");
watcher.EnableRaisingEvents = true;

Console.ReadLine();

Note: The filter parameter allows you to specify which types of files to monitor. It’s optional and by default it watches for changes in all files in the directory.

Running this and creating a new file results in the following output:

File created: C:\temp\movies.txt
File changed: C:\temp\movies.txt
File changed: C:\temp\movies.txt

Notice that it fired multiple events. This is known, intentional behavior:

Common file system operations might raise more than one event. For example, when a file is moved from one directory to another, several OnChanged and some OnCreated and OnDeleted events might be raised.

Microsoft – FileSystemWatcher API documentation

In step 3, I’ll show how you can prevent processing the same file change repeatedly.

2 – Process file change events with a concurrent queue

FileSystemWatcher stops sending file change events when its internal buffer is full, which means you'll miss file changes. There are two things you can do to mitigate this problem:

  1. Increase the buffer size with the InternalBufferSize property. This has a default of 8 KB and a max of 64 KB.
  2. Process the file changes with a concurrent queue.
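If you go with option 1, increasing the buffer is a one-line property assignment. Here's a minimal sketch (it uses the temp directory instead of C:\temp\ just so the example runs anywhere):

```csharp
using System.IO;

var watcher = new FileSystemWatcher(Path.GetTempPath(), "*.txt");

//Default is 8 KB. Values are in bytes, so 64 KB is 64 * 1024.
watcher.InternalBufferSize = 64 * 1024;
```

Keep in mind that a bigger buffer only delays the problem under a heavy burst of changes, which is why option 2 is worth doing as well.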

I recommend using a concurrent queue – such as a BlockingCollection (a blocking queue) or a Channel (an async queue). In the Changed/Created event handlers, add the file change info (such as the file path) to the queue, then process the queue in a loop. This lets the event handlers return as quickly as possible, which minimizes how much buffer space is used.

Here’s an example of how to do this:

using System.Collections.Concurrent;
using System.IO;

var queue = new BlockingCollection<string>();

var watcher = new FileSystemWatcher(@"C:\temp\", filter: "*.txt");

//Enqueue changed file info
watcher.Changed += (_, e) => queue.Add(e.FullPath);
watcher.Created += (_, e) => queue.Add(e.FullPath);
watcher.EnableRaisingEvents = true;

//Process changed files as they are enqueued
while (!queue.IsCompleted)
{
    var filePath = queue.Take(); //blocking dequeue

    if (File.Exists(filePath))
        Console.WriteLine($"Processed file: {filePath}. Content={File.ReadAllText(filePath)}");
}

Running this code and creating a new file results in the following output:

Processed file: C:\temp\movies.txt. Content=hello world
Processed file: C:\temp\movies.txt. Content=hello world
Processed file: C:\temp\movies.txt. Content=hello world

As mentioned in step 1, Created/Changed fire multiple times. This results in the file path getting enqueued and processed three times. We’ll solve that problem next.
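Step 2 used a BlockingCollection, but as mentioned above, a Channel works just as well if you prefer an async consumer. Here's a minimal sketch of the same pattern with System.Threading.Channels (the temp directory stands in for C:\temp\ so the example runs anywhere):

```csharp
using System;
using System.IO;
using System.Threading.Channels;

var dir = Path.Combine(Path.GetTempPath(), "fsw-demo");
Directory.CreateDirectory(dir);

//Unbounded channel: event handlers write, the consumer reads asynchronously
var channel = Channel.CreateUnbounded<string>();

using var watcher = new FileSystemWatcher(dir, "*.txt");

//Keep the handlers tiny - just enqueue the path and return
watcher.Changed += (_, e) => channel.Writer.TryWrite(e.FullPath);
watcher.Created += (_, e) => channel.Writer.TryWrite(e.FullPath);
watcher.EnableRaisingEvents = true;

//Runs until channel.Writer.Complete() is called (i.e. indefinitely here,
//just like the while-loop in the BlockingCollection example)
await foreach (var filePath in channel.Reader.ReadAllAsync())
{
    if (File.Exists(filePath))
        Console.WriteLine($"Processed file: {filePath}");
}
```

The trade-off is the same as with BlockingCollection: the handlers stay fast because they only enqueue, and all the real work happens in the consumer loop.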

3 – Prevent processing the same file change multiple times

FileSystemWatcher fires multiple events when a file is created or modified (as shown in step 1 and step 2 above). This is a problem because it can lead to processing the file multiple times when you really only want to process it once.

To solve this problem, you can keep track of files you’ve processed and their modified dates with a concurrent dictionary. When you see a file again and its modified date hasn’t changed, it means this is a duplicate event that can be ignored.

At this point, it makes sense to encapsulate all of this behavior (the FileSystemWatcher initialization, the queue processing loop, and the duplicate event detection logic) in a single class:

using System.Collections.Concurrent;
using System.IO;

public class FileChangeProcessor
{
    private readonly FileSystemWatcher watcher;
    private readonly BlockingCollection<string> queue;
    private readonly ConcurrentDictionary<string, DateTime> processedFileMap;

    public FileChangeProcessor()
    {
        watcher = new FileSystemWatcher(@"C:\temp\", filter: "*.txt");
        queue = new BlockingCollection<string>();
        processedFileMap = new ConcurrentDictionary<string, DateTime>();

        watcher.Changed += (_, e) => queue.Add(e.FullPath);
        watcher.Created += (_, e) => queue.Add(e.FullPath);
    }

    public void StartProcessingFileChanges()
    {
        //Start watcher
        watcher.EnableRaisingEvents = true;

        //Start consuming queue
        while (!queue.IsCompleted)
        {
            var filePath = queue.Take(); //Blocking dequeue
            var fileInfo = new FileInfo(filePath);

            if (!fileInfo.Exists)
                continue;

            if (processedFileMap.TryGetValue(filePath, out DateTime processedWithModDate))
            {
                if (processedWithModDate == fileInfo.LastWriteTimeUtc)
                {
                    Console.WriteLine($"Ignoring duplicate change event for file: {filePath}");
                    continue;
                }

                //It's a new change, so process it, then update mod date.
                Console.WriteLine($"Processed file again: {filePath}");
                processedFileMap[filePath] = fileInfo.LastWriteTimeUtc;
            }
            else
            {
                //We haven't processed this file before. Process it, then save the mod date.
                Console.WriteLine($"Processed file for the first time: {filePath}.");
                processedFileMap.TryAdd(filePath, fileInfo.LastWriteTimeUtc);
            }
        }
    }
}

Here’s a demo of using this code:

var fileChangeProcessor = new FileChangeProcessor();
Task.Run(() => fileChangeProcessor.StartProcessingFileChanges());

Console.WriteLine("Creating file");
File.WriteAllText(@"C:\temp\movies.txt", "hello");
Console.ReadLine();

Console.WriteLine("Updating file");
File.WriteAllText(@"C:\temp\movies.txt", "hello world");
Console.ReadLine();

This outputs the following:

Creating file
Processed file for the first time: C:\temp\movies.txt.
Ignoring duplicate change event for file: C:\temp\movies.txt

Updating file
Processed file again: C:\temp\movies.txt
Ignoring duplicate change event for file: C:\temp\movies.txt

Notice that it was able to detect the duplicate events and ignore them.