The problem with this is that animal speech isn’t like ours, and animals don’t think in the same way. When apes are taught to sign, they can answer questions, but they never ask their own. Instead, most of their communication is done with body language and sometimes even things like smell. Working out a translation of all these things is hard enough, but building a machine that can do all that as well? Difficult.
In the next 20 years might be optimistic, but it’s definitely a goal for the future. Animals do communicate, and there is no reason we couldn’t learn to understand their ‘language’ – the way they communicate and what it is that they communicate. This is of course completely different to a human language, as their mental processes are dramatically different, but I still think it’s a reasonable goal for within our lifetime.
I don’t think it’s likely to happen in the next 20 years, and you’d need a different machine for each kind of animal, but perhaps one day we’ll manage it! The big problem is that animals don’t communicate in languages as we humans do, so it’s not just a question of translating each different bark of a dog into a different English word. Smell and body language play an important role in animal communication too, so those would have to be taken into account.
Pets and their owners can often get to know each other well enough to understand each other to a certain degree, but they can’t have spoken conversations. Still, you can usually tell when your pet is happy, sad, hungry, angry or scared, and a lot of pets can tell the same about their owners. That’s a kind of communication.