From 6f905184567fc7d3eec6e12cb07f037db9b7fa46 Mon Sep 17 00:00:00 2001
From: TYLER CARAZA-HARTER <tharter@cs544-tharter.cs.wisc.edu>
Date: Fri, 21 Feb 2025 22:12:41 -0600
Subject: [PATCH] explain bigdata.py

---
 p3/README.md | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/p3/README.md b/p3/README.md
index aeb0493..3ec78b2 100644
--- a/p3/README.md
+++ b/p3/README.md
@@ -186,7 +186,10 @@ columns.
 individual files we upload will fit within that limit, but the total
 size of the files uploaded will exceed that limit.  That's why your
 server will have to do sums by reading the files (instead of just
-keeping all table data in memory).
+keeping all table data in memory).  If you want to manually test
+your code with some bigger uploads, use the `bigdata.py` client.
+Instead of uploading files, it randomly generates lots of
+CSV-formatted data and uploads it directly via gRPC.
 
 ## Part 4: Locking
 
-- 
GitLab
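
For illustration only, the general idea behind a `bigdata.py`-style client might look like the sketch below. All names here are assumptions, not the project's actual API: the generated stub modules (`table_pb2`, `table_pb2_grpc`), the `TableStub` class, the `Upload` RPC with a `csv_data` bytes field and an `error` response field, and the port 5440 are hypothetical placeholders; the real proto definitions in p3 may differ.

```python
# Sketch of a bigdata.py-style client (hypothetical proto/RPC names).
import random
import grpc
import table_pb2, table_pb2_grpc  # assumed generated gRPC stubs; names may differ

def random_csv(rows=100_000):
    # Build a large CSV payload in memory instead of reading an uploaded file.
    lines = ["x,y,z"]
    for _ in range(rows):
        lines.append(",".join(str(random.randint(0, 100)) for _ in range(3)))
    return "\n".join(lines).encode("utf-8")

def main():
    channel = grpc.insecure_channel("localhost:5440")  # assumed server port
    stub = table_pb2_grpc.TableStub(channel)           # assumed stub class
    # Send several uploads so the total data exceeds what fits in memory.
    for i in range(10):
        resp = stub.Upload(table_pb2.UploadReq(csv_data=random_csv()))
        print(f"upload {i}: error={resp.error!r}")

if __name__ == "__main__":
    main()
```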