Design an algorithm to encode a list of strings to a string. The encoded string is then sent over the network and is decoded back to the original list of strings.
Machine 1 (sender) has the function:
string encode(vector<string> strs) {
    // ... your code
    return encoded_string;
}

Machine 2 (receiver) has the function:

vector<string> decode(string s) {
    // ... your code
    return strs;
}

So Machine 1 does:

string encoded_string = encode(strs);

and Machine 2 does:

vector<string> strs2 = decode(encoded_string);

strs2 in Machine 2 should be the same as strs in Machine 1.
Implement the encode and decode methods.
Example 1:
Input: dummy_input = ["Hello","World"]
Output: ["Hello","World"]
Explanation:
Machine 1:
Codec encoder = new Codec();
String msg = encoder.encode(strs);
Machine 1 ---msg---> Machine 2
Machine 2:
Codec decoder = new Codec();
String[] strs = decoder.decode(msg);

Example 2:
Input: dummy_input = [""]
Output: [""]

Constraints:
0 <= strs.length < 100
0 <= strs[i].length < 200
strs[i] contains any possible characters out of 256 valid ASCII characters.
Follow up: Could you write a generalized algorithm to work on any possible set of characters?
Recommended Time & Space Complexity
You should aim for a solution with O(m) time for each encode() and decode() call and O(m+n) space, where m is the sum of lengths of all the strings and n is the number of strings.
Hint 1
A naive solution would be to use a non-ASCII character as a delimiter. Can you think of a better way?
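To make the naive idea concrete, here is a minimal sketch of delimiter-based joining. The choice of '\x1f' (the ASCII "unit separator") is an assumption; since the inputs may contain any of the 256 ASCII characters, no single delimiter byte is actually safe, and the scheme is also ambiguous for an empty list versus a list containing one empty string. That is exactly why the hint asks for a better way.

```cpp
#include <string>
#include <vector>
#include <sstream>
using namespace std;

const char DELIM = '\x1f';  // ASCII unit separator; assumed absent from inputs

// Join the strings with the delimiter byte.
string encode(const vector<string>& strs) {
    string out;
    for (size_t i = 0; i < strs.size(); ++i) {
        if (i > 0) out += DELIM;
        out += strs[i];
    }
    return out;
}

// Split on the delimiter byte. Note: cannot distinguish an empty
// list from a list holding one empty string, and breaks entirely
// if any input string contains DELIM.
vector<string> decode(const string& s) {
    vector<string> strs;
    string cur;
    stringstream ss(s);
    while (getline(ss, cur, DELIM)) strs.push_back(cur);
    return strs;
}
```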
Hint 2
Try to encode and decode the strings using a smart approach based on the length of each string. How can you differentiate between the lengths and any digits that might be present in the strings themselves?
Hint 3
We can use an encoding approach where we start with a number representing the length of the string, followed by a separator character (let's use # for simplicity), and then the string itself. To decode, we read the number until we reach a #, then use that number to read the specified number of characters as the string.
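The length-prefix scheme from Hint 3 can be sketched as follows (free functions rather than the Codec class used in the examples; the names mirror the problem's signatures). Each string becomes "<length>#<string>", e.g. ["Hello","World"] encodes to "5#Hello5#World", and decoding never has to guess where a string ends.

```cpp
#include <string>
#include <vector>
using namespace std;

// Encode: prefix each string with its length and a '#' separator.
// "Hello" -> "5#Hello", so ["Hello","World"] -> "5#Hello5#World".
string encode(const vector<string>& strs) {
    string out;
    for (const string& s : strs) {
        out += to_string(s.size()) + "#" + s;
    }
    return out;
}

// Decode: read digits up to the next '#', then consume exactly that
// many characters as the next string. This is unambiguous even when
// the strings themselves contain digits or '#', because the length
// field tells us precisely where each string ends.
vector<string> decode(const string& s) {
    vector<string> strs;
    size_t i = 0;
    while (i < s.size()) {
        size_t j = s.find('#', i);              // end of the length field
        size_t len = stoul(s.substr(i, j - i)); // parse the length
        strs.push_back(s.substr(j + 1, len));
        i = j + 1 + len;                        // jump past this string
    }
    return strs;
}
```

Both functions are a single pass over m total characters, matching the recommended O(m) time per call. For the follow-up, the same idea works for any character set, since the payload bytes are never inspected, only counted.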