Meta Expands Translation Tools with AI Voice Dubbing
Meta has introduced a new AI voice translation feature for Facebook and Instagram Reels. The tool allows creators to dub videos into another language, starting with English and Spanish, and offers an optional lip-sync adjustment that aligns the translated audio with the speaker’s mouth movements.
The translation tool is available to Facebook creators with at least 1,000 followers and to all public Instagram accounts. Meta confirmed that the rollout covers all regions where its AI products are already active.
How Meta Translate Works
When a creator records a Reel, the tool can automatically translate their spoken words into another language. Creators can preview the dubbed version, toggle lip-sync on or off, and publish the video with translated audio.
For viewers, translated Reels automatically play in the language that matches their preferred language setting. Each translated video includes a label identifying it as AI-translated, maintaining transparency.
Creators also receive new language-based audience insights, showing how many views came from each translation.
Best Practices for Creators
Meta recommends the following for the best results with AI translation:
- Record while facing the camera.
- Keep speech clear and avoid covering the mouth.
- Minimize background noise.
- Limit videos to no more than two people speaking at once.
Following these practices improves dubbing accuracy and lip-sync performance.
Global Impact and Content Accessibility
By introducing AI-powered translations, Meta is positioning Reels to extend creator reach across language barriers. A Spanish-speaking creator can now reach English-speaking audiences, and vice versa, without relying on third-party dubbing.
This move aligns with Meta’s strategy of using AI to drive global engagement, making Reels more competitive against TikTok and YouTube Shorts.
Translation Error and Public Backlash
Despite the innovation, Meta faced criticism in July 2025 after a faulty Kannada translation wrongly implied that Karnataka Chief Minister Siddaramaiah had died. In reality, he was expressing condolences for another person. The mistranslation went viral, leading to public outrage.
Meta issued an apology and confirmed the error had been corrected. The incident highlighted the risks of AI-powered translation and the need for strong safeguards, especially in sensitive contexts.
What Comes Next for Meta Translate
The feature currently supports only English and Spanish. However, reports suggest Meta plans a phased expansion to additional languages, with Hindi, Portuguese, and other widely spoken languages expected next.
If successful, this technology could transform how creators and viewers interact with content globally, allowing Reels to become a truly multilingual platform.
Final Outlook
Meta Translate represents a bold step toward making social media more inclusive and accessible. While the Kannada mistranslation incident raised concerns, the new feature shows Meta’s intent to push forward with AI-driven global communication.
If Meta can balance accuracy with scale, this innovation could change how creators build audiences across borders.






