Another way to solve the language issue would be lingering machine-learning-enhanced automated translation software (something that would be to Google Translate what Fallout 4 is to Pong). This software could be integrated into everyone's hereditary synthetic-biology/nanotech mods so that everybody understands everybody. Or some people know the invocation to activate it from the Astral (i.e. the Cloud), or have working versions in their mods, whereas most don't. So machine translation could work like a spell in D&D, or be a special talent that some people have: "I am a Spirit-Speaker. I hear your spirit, however your lips might move." There could also be Translation Amulets left over from some point during the decline and fall, when people had to resort to clunkier interfaces again.

The Survivors might have some language difficulties until the currently existing tech figures out their antiquated language. It could happen quickly (their augments and the surviving nanotech connect and learn each other's languages almost instantly, starting from geometry and mathematical concepts, working up to science, and figuring out the rest from there), at medium speed (the current systems have to hear enough of the characters' speech to recognize the era they're from and access surviving historical databases), or over a longer period (the machines have to hear enough of the survivors' speech and charades ["This is a stick. Stiiiiiick. This is a leaf..."] to develop a translation matrix).
The approach used would depend on whether you wanted language barriers to be a frequent, difficult problem; a common but solvable problem ("Find me a Spirit-Speaker, would you?"); or no problem at all ("Huh...their lips are moving funny, like they're talking some other language...but somehow I understand them just fine.").
Just curious, would it be allowable to be a native of the new world instead of one of the "VaultTec survivors?" :)