Story
In real telecommunication, errors occur at the bit level, which is why we should learn how to deal with them. Now, instead of making one error every three bytes, the program at this stage should make an error in one bit of every byte. So, every byte will be corrupted, but only slightly.
For example, let's take the symbol 'A'. In binary, it is 01000001. If we make an error in a single bit, it could become 01010001, 00000001, or 11000001: a completely different symbol, yet very close to 'A' in binary. By the way, you can see the binary and hexadecimal representations of all the standard symbols here.
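Here is a minimal sketch of how flipping a single bit changes 'A'; the class name BitFlipDemo and the chosen bit position are only for illustration:

```java
public class BitFlipDemo {
    public static void main(String[] args) {
        byte original = 'A';                                      // 01000001
        int bitPosition = 4;                                      // chosen arbitrarily for this demo
        byte corrupted = (byte) (original ^ (1 << bitPosition));  // XOR with a mask flips exactly one bit

        System.out.println(Integer.toBinaryString(original & 0xFF));   // 1000001
        System.out.println(Integer.toBinaryString(corrupted & 0xFF));  // 1010001
        System.out.println((char) corrupted);                          // 'Q' — a different symbol
    }
}
```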
Objective
In this stage, you should write a program that reads the text the user wants to send from the file send.txt and simulates sending it through a poor internet connection by making a one-bit error in every byte of the text. Note that after these manipulations the data is no longer a string: some bytes may no longer correspond to any specific character in the table, because Java's String implementation is not based on the ASCII table. Java uses Unicode, which happens to match ASCII for the first 128 symbols, but no further. The String class is too high-level for this kind of bit manipulation, so you should work with bytes or chars instead.
The received message, which contains a single-bit error in every byte, should be saved to received.txt.
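One possible sketch of such a program, assuming send.txt and received.txt are in the working directory; the class name Transmitter is just for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Random;

public class Transmitter {
    public static void main(String[] args) throws IOException {
        byte[] data = Files.readAllBytes(Path.of("send.txt"));  // read raw bytes, not a String
        Random random = new Random();

        for (int i = 0; i < data.length; i++) {
            int bit = random.nextInt(8);                  // pick one of the 8 bit positions
            data[i] = (byte) (data[i] ^ (1 << bit));      // XOR flips exactly that bit
        }

        Files.write(Path.of("received.txt"), data);       // save the corrupted bytes
    }
}
```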
Also, to inspect this file in hex format, you can use the DeltaHex Editor, a plugin for IntelliJ IDEA.