In this article I want to dive into localising your Actions on Google tools, so that we can provide a better experience for users around the world.
What is Localisation?
If you’re not already aware of what localisation is, we can look at it as the process of adapting our products for different countries / regions. In the case of screen-based applications this isn’t just about changing the language — a lot of the time you will find that visuals may be portrayed differently in different parts of the world. Whilst this is the case, in this article we are just going to be dealing with a voice-based tool, so we’re going to focus on the language used by our Actions on Google tool. Visually we could achieve localisation with Rich Responses, but we’ll cover that in more detail in another post!
Currently, Actions on Google supports a collection of different locales that can be used by your tools; these are:
- English (en-US, en-GB, en-AU, en-IN) 🇬🇧
- German (de-DE) 🇩🇪
- French (fr-FR, fr-CA) 🇫🇷
- Japanese (ja-JP) 🇯🇵
- Korean (ko-KR) 🇰🇷
- Spanish (es-ES, es-419) 🇪🇸
- Portuguese (pt-BR) 🇧🇷
- Italian (it-IT) 🇮🇹
- Russian (ru-RU) 🇷🇺
Whilst this isn’t every locale that we could think of, it’s still enough to be able to provide a better experience for more of your users!
With these locales in mind, Dialogflow enables us to create tools that accept input from the user and return conversational responses in these languages. Within our project settings we are able to set up our intents & entities individually for each of our desired locales. From here, our users can use our conversational tool in their desired language and our tool will output responses in the same language (provided it is supported!).
Let’s go ahead and take a look at how we can set this up within our projects.
Localised responses from Dialogflow
To begin with, we’re going to add another supported locale to our Dialogflow project. To do so, you need to hit the “+” icon within the project sidebar.
Once you’ve done so, you should be able to see some project setting categories within a tab bar. You should automatically be taken to the “Languages” section, but if not then you’ll need to select that navigation item. Once here, we can go ahead and select the locale that we wish to support. Here you can see that I have added Italian to my supported languages.
Once you have done this, the language will have been added to your project. If you wish to remove any languages from your project in the future then you can do so from this same menu.
At this point you should be able to see the new locale within your project sidebar. Any changes that you make to your project will be applied to the currently selected locale. Because my project has already been created for the “en” locale, here I’ve selected “it” so that I can adjust my entities and intents to work as intended for the Italian locale.
We’re going to begin by amending the intents so that Italian users have a personalised experience when using my app. I’m going to start by opening up the default welcome intent for my project. The text response here for my “en” locale is:
Now that we are editing the intent for the “it” locale, we can go ahead and customise this for the Italian language:
I also have an intent called learn.chord within my project — so for this, I’m going to go ahead and do the same again for the “it” locale, adjusting the accepted user expressions for the Italian language:
You need to make sure that you adjust all of the text for responses, accepted expressions and any other content that acts as a form of input or output for the selected locale in your project. Once you’ve done so, that will be enough for your project to support different locales for logic contained inside of Dialogflow.
However, if you’re using logic from outside of Dialogflow (such as Firebase Cloud Functions) then you’ll need to add some locale checks inside of this code too.
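For illustration, a crude version of such a check (before we bring in a proper localisation library) might look something like the snippet below. It uses the getUserLocale() method that we’ll meet again shortly, and the hard-coded strings here are just placeholders:

// A naive locale check inside our fulfillment logic.
// "app" is the same ApiAiApp instance that we create later in this post.
const locale = app.getUserLocale(); // e.g. "it-IT" or "en-US"
if (locale && locale.startsWith('it')) {
  app.tell('Benvenuto!');
} else {
  app.tell('Welcome!'); // fall back to English for any other locale
}

As you can imagine, this approach quickly becomes unmanageable as our strings and supported locales grow, which is exactly why we’ll reach for a library instead.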
Localised responses from code
To achieve localised responses from logic within your code there are a couple of different ways to do so. In this article we’re going to look at using the i18n library on npm. You can follow the steps over on the link there to add this package to your project.
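For reference, that typically just means installing it from npm and saving it to your project’s dependencies:

npm install i18n --save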
In order to use localised strings in our app we’re going to need to provide a different strings file per localisation. We’re going to place these inside a locales directory inside of our Actions on Google project directory.
Because we’re going to have English and Italian localisation in our app, I’ve created a JSON file for each of these.
Each of these JSON files contains a KEY to retrieve a locale string, along with a VALUE for the specified key. Here is the content for both my it-IT.json and en-US.json files:
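The exact strings will depend on your own app, but as an illustration the files might look something like this. The keys below match the ones used later in this post, whilst the values (including the Italian translations) are placeholder examples of my own, and the {} placeholders are filled in later using the string-format library:

en-US.json:

{
  "STRING_MUTED": "The audio is currently muted.",
  "ERROR_NOT_FOUND": "Sorry, I couldn't find that chord.",
  "CHORD_TITLE": "Chord: {}",
  "CHORD_DESC": "Here's how to play the {} chord. {}",
  "WHAT_NEXT": "What would you like to do next?"
}

it-IT.json:

{
  "STRING_MUTED": "L'audio è attualmente disattivato.",
  "ERROR_NOT_FOUND": "Spiacente, non ho trovato quell'accordo.",
  "CHORD_TITLE": "Accordo: {}",
  "CHORD_DESC": "Ecco come suonare l'accordo {}. {}",
  "WHAT_NEXT": "Cosa vorresti fare adesso?"
}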
Now that we have both locale files defined for the strings within our app, we can go ahead and make use of them in our project! We’re going to need to begin by including a reference to i18n and configuring it within our .js file:
const i18n = require('i18n');

i18n.configure({
  locales: ['en-US', 'it-IT'],
  directory: __dirname + '/locales',
  defaultLocale: 'en-US'
});
As above, during configuration we need to define three things:
- The supported locales for our project. These must match the names of our locale files.
- The directory in which our locale text files are stored relative to the location of the file we are performing the configuration in.
- The default locale we wish to be used for our project — it’s good to define this as a fallback.
Now that i18n is set up in our project, we can go ahead and make use of it. We’ll need to start by setting the locale that we wish to be used by our i18n instance. Luckily, the Dialogflow SDK provides us with a method that can be used to fetch the locale of the current user. We can then use this to assign the locale to be used by i18n:
const app = new ApiAiApp({ request, response });
i18n.setLocale(app.getUserLocale());
Now that this is set up, we can access our localised strings by using their key identifiers. For example, this would provide us with the localised string for the value stored against the STRING_MUTED key:
i18n.__('STRING_MUTED')
We can then make use of these when presenting text / voice responses to our users:
app.tell(i18n.__('ERROR_NOT_FOUND'))
We can also use JavaScript formatting libraries along with i18n to construct strings that require formatting with string arguments.
const format = require('string-format');

format(i18n.__('CHORD_TITLE'), userInput)
format(i18n.__('CHORD_DESC'), input, i18n.__('WHAT_NEXT'))
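Taking the placeholder en-US.json from earlier as an example, format(i18n.__('CHORD_TITLE'), 'Am') would produce “Chord: Am” for an English user, whilst an Italian user would receive “Accordo: Am” from the exact same code path.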
Another great thing you could do is also localise your rich responses. To do so, you could use keys in the locale JSON files to hold the paths of localised images to use on platforms with a screen.
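As a minimal sketch, assuming the same ApiAiApp instance as above and two hypothetical keys (CARD_TEXT and IMAGE_CHORD_URL) added to each locale file, that could look something like this:

// Both keys here are hypothetical additions to the locale files:
// CARD_TEXT holds the card body text, IMAGE_CHORD_URL a locale-specific image path.
if (app.hasSurfaceCapability(app.SurfaceCapabilities.SCREEN_OUTPUT)) {
  app.ask(app.buildRichResponse()
    .addSimpleResponse(i18n.__('WHAT_NEXT'))
    .addBasicCard(app.buildBasicCard(i18n.__('CARD_TEXT'))
      .setImage(i18n.__('IMAGE_CHORD_URL'), i18n.__('CARD_TEXT'))));
} else {
  app.ask(i18n.__('WHAT_NEXT')); // voice-only surfaces just get the spoken prompt
}

This way, users on a screen-based surface receive a card with an image suited to their locale, whilst voice-only users still get the same localised spoken response.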
Conclusion
I hope from this post you’ve been able to learn how we can localise our conversations, as well as the benefits it will bring for our users. If you’re setting up localisation, or have any questions or comments on this article, then please drop me a tweet 🙌🏻