#!/bin/bash

echo "🔄 Testing Production Sync Workflow with Different File Sizes"
echo "=============================================================="

# Set up test directories
TEST_DIR="/tmp/keepsync-production-test"
LOCAL_DIR="$TEST_DIR/local-files"
DOWNLOAD_DIR="$TEST_DIR/downloaded-files"

# Clean up any existing test
rm -rf "$TEST_DIR"
mkdir -p "$LOCAL_DIR"
mkdir -p "$DOWNLOAD_DIR"
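# Optional safety net (a sketch, not required by the workflow): remove the
# test tree even if the script exits early, e.g. after a failed sync.

```shell
# Clean up $TEST_DIR on any exit, including early failures.
# Assumes nothing outside this script uses the directory.
TEST_DIR="${TEST_DIR:-/tmp/keepsync-production-test}"
trap 'rm -rf "$TEST_DIR"' EXIT
```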

echo "📁 Created test directories:"
echo "   Local files: $LOCAL_DIR"
echo "   Download test: $DOWNLOAD_DIR"

# Create test files of different sizes
echo "📝 Creating test files of different sizes..."

# Small text file (< 1KB)
cat > "$LOCAL_DIR/small-document.txt" << 'EOF'
This is a small text document for testing.
It contains multiple lines of text.
Line 3: Testing special characters: àáâãäåæçèéêë
Line 4: Numbers and symbols: 123456789 !@#$%^&*()
Line 5: Unicode: 🔐 🚀 ✅ ❌ 📁 📋
EOF

# Medium text file (~3KB)
cat > "$LOCAL_DIR/medium-document.txt" << 'EOF'
This is a medium-sized text document for testing the quantum S3 provider.

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed do eiusmod tempor
incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis
nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.

Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore
eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident,
sunt in culpa qui officia deserunt mollit anim id est laborum.

Section 2: Technical Content
============================

The quantum-enhanced S3 provider implements several key security features:

1. Per-chunk encryption with unique derived keys
2. Hardware TPM integration for master key management
3. Secure key derivation using HKDF
4. File integrity verification with SHA256 hashes
5. Obfuscated filename storage for privacy

Each chunk is encrypted with a unique key derived from:
- Master TPM hardware key (256-bit AES)
- File-specific derivation salt
- Chunk index for uniqueness
- Additional entropy from secure random generator

This ensures that even if one chunk key is compromised, all other chunks
remain secure, preserving forward secrecy.

Section 3: Performance Characteristics
======================================

The system handles various file sizes efficiently:
- Small files (< 1KB): Single chunk with padding for S3 transport
- Medium files (1KB - 5MB): Adaptive chunking based on content
- Large files (> 5MB): Parallel processing with optimal chunk sizes

Transport batching ensures S3 compatibility:
- Logical chunks are batched to meet 5MB minimum
- Padding is added when necessary
- Metadata tracks batch structure for reconstruction

End of medium document.
EOF

# Large text file (~7KB)
cat > "$LOCAL_DIR/large-document.txt" << 'EOF'
This is a large text document for testing chunking behavior with bigger files.

CHAPTER 1: INTRODUCTION TO QUANTUM-ENHANCED STORAGE
===================================================

The evolution of cloud storage has reached a critical juncture where traditional
encryption methods are becoming insufficient for protecting sensitive data against
emerging quantum computing threats. This document explores the implementation of
a quantum-enhanced S3 storage provider that addresses these challenges through
innovative security mechanisms.

Traditional cloud storage systems typically encrypt entire files with a single
key, creating a significant vulnerability: if the encryption key is compromised,
the entire file becomes accessible to attackers. Our quantum-enhanced approach
fundamentally changes this paradigm by implementing per-chunk encryption with
unique derived keys.

CHAPTER 2: SECURITY ARCHITECTURE
================================

The security architecture is built on several foundational principles:

2.1 Hardware TPM Integration
----------------------------
At the core of our security model lies the Trusted Platform Module (TPM), a
hardware security chip that provides:
- Secure key generation using hardware random number generators
- Protected key storage that cannot be extracted
- Cryptographic operations performed in secure hardware
- Attestation capabilities for system integrity verification

The TPM serves as the root of trust for all cryptographic operations. A master
256-bit AES key is generated and stored securely within the TPM hardware,
ensuring that even with administrative access to the system, the master key
cannot be extracted or compromised.

2.2 Secure Key Derivation
-------------------------
From the master TPM key, we derive unique keys for different purposes using
the HMAC-based Key Derivation Function (HKDF) as specified in RFC 5869:

- File-specific keys: Derived using the file path as context
- Chunk-specific keys: Derived using file key + chunk index
- Metadata keys: Derived using file path + "metadata" context

This hierarchical key derivation ensures that:
- Each chunk has a completely unique encryption key
- Compromise of one key does not affect others
- Keys can be regenerated deterministically when needed
- Forward secrecy is maintained

2.3 Per-Chunk Encryption
------------------------
The revolutionary aspect of our approach is the per-chunk encryption mechanism:

1. Files are first divided into logical chunks (before encryption)
2. Each chunk receives a unique derived encryption key
3. Chunks are encrypted independently using AES-256-GCM
4. Encrypted chunks are then batched for efficient transport

This approach provides several critical security advantages:
- Granular security: Each chunk is independently protected
- Reduced blast radius: Compromise of one chunk key affects only that chunk
- Parallel processing: Chunks can be encrypted/decrypted concurrently
- Integrity verification: Each chunk has its own authentication tag

CHAPTER 3: IMPLEMENTATION DETAILS
=================================

The system employs an adaptive chunking strategy that considers file size,
content characteristics, network conditions, and storage provider requirements.

For small files (< 1KB), a single chunk is created but still encrypted with
a unique derived key to maintain consistency. Medium files (1KB - 5MB) are
chunked based on content analysis, while large files (> 5MB) use fixed-size
chunks optimized for parallel processing.

Transport batching ensures S3 compatibility by combining logical chunks into
transport batches that meet the 5MB minimum requirement, with metadata tracking
which logical chunks are contained in each batch.

CONCLUSION
==========

The quantum-enhanced S3 provider represents a significant advancement in
cloud storage security, providing protection against both current and
future threats while maintaining high performance and compatibility with
existing storage infrastructure.

End of large document - total size approximately 7KB.
EOF

echo "✅ Created test files:"
echo "   📄 small-document.txt ($(wc -c < "$LOCAL_DIR/small-document.txt") bytes)"
echo "   📄 medium-document.txt ($(wc -c < "$LOCAL_DIR/medium-document.txt") bytes)"
echo "   📄 large-document.txt ($(wc -c < "$LOCAL_DIR/large-document.txt") bytes)"

# Change to KeepSync directory (fail fast if it is missing)
cd /home/master/.local/share/containers/storage/volumes/coding/_data/coding/go-lang/keepSync || { echo "❌ KeepSync directory not found"; exit 1; }

# Test 1: Upload files using sync command
echo ""
echo "🔄 Test 1: Uploading files using KeepSync sync command..."
echo "======================================================="

# First, let's check if we need to configure S3
echo "🔧 Checking S3 configuration..."
if [ ! -f "$HOME/.keepsync/config.json" ]; then
    echo "📝 Creating S3 configuration..."
    mkdir -p "$HOME/.keepsync"
    cat > "$HOME/.keepsync/config.json" << 'EOF'
{
  "providers": {
    "s3": {
      "bucket": "storage-a01",
      "region": "us-east-1",
      "endpoint": "https://s3.filebase.com",
      "access_key": "EF4A740258F43842F16E",
      "secret_key": "ZUXkT90Fg8LTdC8QzrEnMSSubldd7eKsRyylukRD",
      "key_name": "s3-quantum-key-v3"
    }
  }
}
EOF
    echo "✅ S3 configuration created"
else
    echo "✅ S3 configuration already exists"
fi
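# Optional sanity check (a sketch; assumes python3 is on PATH — jq would work
# just as well): confirm the config file parses as JSON before syncing.

```shell
# Validate ~/.keepsync/config.json as JSON; warn but continue on failure.
CONFIG_FILE="$HOME/.keepsync/config.json"
if [ -f "$CONFIG_FILE" ] && command -v python3 >/dev/null 2>&1; then
    if python3 -m json.tool "$CONFIG_FILE" >/dev/null 2>&1; then
        echo "✅ config.json is well-formed JSON"
    else
        echo "❌ config.json is not valid JSON"
    fi
fi
```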

# Upload files using sync
echo "📤 Syncing local files to S3..."
./bin/keepsync sync "$LOCAL_DIR" "s3://storage-a01/test-sync/" --verbose

# Test 2: Download files back to verify round-trip
echo ""
echo "📥 Test 2: Downloading files back using sync command..."
echo "====================================================="

echo "📥 Syncing files back from S3..."
./bin/keepsync sync "s3://storage-a01/test-sync/" "$DOWNLOAD_DIR" --verbose

# Test 3: Verify content integrity
echo ""
echo "🔍 Test 3: Verifying content integrity..."
echo "========================================"

echo "🔍 Comparing small-document.txt..."
if [ -f "$DOWNLOAD_DIR/small-document.txt" ] && diff "$LOCAL_DIR/small-document.txt" "$DOWNLOAD_DIR/small-document.txt" > /dev/null; then
    echo "✅ small-document.txt: Content matches perfectly"
else
    echo "❌ small-document.txt: Content mismatch or file missing!"
    ls -la "$DOWNLOAD_DIR/"
fi

echo "🔍 Comparing medium-document.txt..."
if [ -f "$DOWNLOAD_DIR/medium-document.txt" ] && diff "$LOCAL_DIR/medium-document.txt" "$DOWNLOAD_DIR/medium-document.txt" > /dev/null; then
    echo "✅ medium-document.txt: Content matches perfectly"
else
    echo "❌ medium-document.txt: Content mismatch or file missing!"
fi

echo "🔍 Comparing large-document.txt..."
if [ -f "$DOWNLOAD_DIR/large-document.txt" ] && diff "$LOCAL_DIR/large-document.txt" "$DOWNLOAD_DIR/large-document.txt" > /dev/null; then
    echo "✅ large-document.txt: Content matches perfectly"
else
    echo "❌ large-document.txt: Content mismatch or file missing!"
fi

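# Extra integrity check beyond diff (a sketch; assumes GNU coreutils
# sha256sum is available), mirroring the SHA256 verification the provider
# itself is described as performing.

```shell
# Compare SHA256 digests of each original/downloaded pair.
LOCAL_DIR="${LOCAL_DIR:-/tmp/keepsync-production-test/local-files}"
DOWNLOAD_DIR="${DOWNLOAD_DIR:-/tmp/keepsync-production-test/downloaded-files}"
for name in small-document.txt medium-document.txt large-document.txt; do
    if [ -f "$LOCAL_DIR/$name" ] && [ -f "$DOWNLOAD_DIR/$name" ]; then
        orig_sum=$(sha256sum "$LOCAL_DIR/$name" | awk '{print $1}')
        down_sum=$(sha256sum "$DOWNLOAD_DIR/$name" | awk '{print $1}')
        if [ "$orig_sum" = "$down_sum" ]; then
            echo "✅ $name: SHA256 digests match"
        else
            echo "❌ $name: SHA256 digest mismatch"
        fi
    fi
done
```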
# Test 4: Verify file readability
echo ""
echo "📖 Test 4: Verifying file readability..."
echo "======================================="

if [ -f "$DOWNLOAD_DIR/small-document.txt" ]; then
    echo ""
    echo "📄 small-document.txt (first 3 lines):"
    head -n 3 "$DOWNLOAD_DIR/small-document.txt"
else
    echo "❌ small-document.txt not found"
fi

if [ -f "$DOWNLOAD_DIR/medium-document.txt" ]; then
    echo ""
    echo "📄 medium-document.txt (first 5 lines):"
    head -n 5 "$DOWNLOAD_DIR/medium-document.txt"
else
    echo "❌ medium-document.txt not found"
fi

if [ -f "$DOWNLOAD_DIR/large-document.txt" ]; then
    echo ""
    echo "📄 large-document.txt (first 5 lines):"
    head -n 5 "$DOWNLOAD_DIR/large-document.txt"
else
    echo "❌ large-document.txt not found"
fi

# Test 5: File size verification
echo ""
echo "📏 Test 5: File size verification..."
echo "==================================="

echo "📏 Original vs downloaded file sizes:"
if [ -f "$DOWNLOAD_DIR/small-document.txt" ]; then
    echo "   small-document.txt: $(wc -c < "$LOCAL_DIR/small-document.txt") → $(wc -c < "$DOWNLOAD_DIR/small-document.txt") bytes"
else
    echo "   small-document.txt: $(wc -c < "$LOCAL_DIR/small-document.txt") bytes → MISSING"
fi

if [ -f "$DOWNLOAD_DIR/medium-document.txt" ]; then
    echo "   medium-document.txt: $(wc -c < "$LOCAL_DIR/medium-document.txt") → $(wc -c < "$DOWNLOAD_DIR/medium-document.txt") bytes"
else
    echo "   medium-document.txt: $(wc -c < "$LOCAL_DIR/medium-document.txt") bytes → MISSING"
fi

if [ -f "$DOWNLOAD_DIR/large-document.txt" ]; then
    echo "   large-document.txt: $(wc -c < "$LOCAL_DIR/large-document.txt") → $(wc -c < "$DOWNLOAD_DIR/large-document.txt") bytes"
else
    echo "   large-document.txt: $(wc -c < "$LOCAL_DIR/large-document.txt") bytes → MISSING"
fi

echo ""
echo "🎉 Production Sync Workflow Test Complete!"
echo "=========================================="

# Show what files are in the download directory
echo "📁 Files in download directory:"
ls -la "$DOWNLOAD_DIR/"

# Cleanup
echo "🧹 Cleaning up test directories..."
rm -rf "$TEST_DIR"

echo "✅ All tests completed!"