
Bar Finder: Alexa Skill

I have written about my passion for Alexa skill development a few times, and spoken about it both internally at Cloudreach and at an AWS Meetup. The passion is still there and still growing, as Amazon add more functionality to Alexa every couple of weeks.

Although it is a little frustrating that a lot of the device-specific features are yet to launch in the UK (Drop In, notifications, calling, etc.), I still like to keep up with what is going on and throw my hat in the ring when something catches my eye.

This time around, it was Amazon's promotion for August: if you publish a skill in August without using a standard template, you can get a free Echo Dot and a pair of funky Alexa socks! You had me at free. I am in!

Bar finder

The skill I ended up creating, testing and publishing was Bar Finder. As the name suggests, it will find bars near you! Or rather, near your Alexa device's location. I added a bunch of variations of this question to the skill, such as “Where can I go to get smashed” or “Where can we go to get steaming” - I am open to more suggestions along these lines!

Bar Finder uses an AWS Lambda function to do all the work (#Serverless) and is built from the following components:

  • AWS Lambda: Main logic processing. Handles all requests and returns appropriate responses
  • Amazon Alexa Location API: Dealing with user location was an interesting area, especially around permissions (there is a rough sketch of this just after the list).
  • Google Places API: A brilliant API that allows you to get businesses within a radius of a location.
  • AWS S3: Image storage
  • AWS DynamoDB: User info storage (No location information)
  • Python-alexa: My Python based library for Alexa skill development
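
Since the location side tripped me up a little, here is roughly what fetching the device's postcode looks like once the user has granted the permission. This is a hand-rolled sketch rather than the skill's actual code (the helper name is mine and the error handling is simplified); the endpoint and token fields are the standard Alexa Device Address API ones:

import requests

def get_device_postcode(event):
    # The incoming request carries an API access token when the user has
    # granted the skill location permissions
    system = event['context']['System']
    token = system.get('apiAccessToken')
    if not token:
        return None  # permission not granted - prompt the user to enable it

    device_id = system['device']['deviceId']
    url = "{}/v1/devices/{}/settings/address/countryAndPostalCode".format(
        system['apiEndpoint'], device_id)

    response = requests.get(url, headers={'Authorization': 'Bearer ' + token})
    if response.status_code != 200:
        return None  # address not set, or consent was revoked

    return response.json().get('postalCode')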

Putting it all together

With all the pieces ready, I started putting the puzzle together. Using my (awesome) Python framework for Alexa dev, I was quickly able to get the scaffolding in place to accept a LaunchRequest (when you start the skill) and route requests for the Nearby intent (picks a bar within a radius of you) and the HowMany intent (finds bars near you and lists the top three if more than three are found).
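
For anyone who hasn't used python-alexa, the routing underneath boils down to checking the request type and intent name. A stripped-down sketch without the framework (the welcome text and handler bodies here are placeholders, not the skill's real responses):

def speech(text, end_session=True):
    # Minimal Alexa response envelope with plain-text output speech
    return {
        'version': '1.0',
        'response': {
            'outputSpeech': {'type': 'PlainText', 'text': text},
            'shouldEndSession': end_session,
        },
    }

def lambda_handler(event, context):
    request = event['request']

    if request['type'] == 'LaunchRequest':
        return speech("Welcome to Bar Finder. Ask me where you can get a drink.",
                      end_session=False)

    if request['type'] == 'IntentRequest':
        intent = request['intent']['name']
        if intent == 'Nearby':
            # the real handler looks up the device location and picks one bar
            return speech("How about the nearest bar I can find?")
        if intent == 'HowMany':
            # the real handler lists the top three results it finds nearby
            return speech("I found a few bars near you.")

    return speech("Sorry, I didn't catch that.", end_session=False)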

Configuring the skill became a pain at this point - I wanted to support as many variations of "where can I go to get a drink" as I could without having to type out each utterance. So I came up with an upgrade to the Alexa framework.

Utterance Generator:
The generator takes in a JSON config file and spits out every variation of an utterance based on the options you give it.

For example:

  "Nearby": [
    "where [can,should] [we,i] go [{DISTANCE},] [for,to] [get a,a,get]
    "where [can,should] [we,i] go [{DISTANCE},for,to] [get a,a,get] [smashed,drunk,drink] that is {[PRICE,DISTANCE]}"
  "HowMany": [
    "[how many,what] [bars,pubs,places] [can you find,are there,is there]",
    "[how many,what] [bars,pubs,places] [can you find,are there,is there] that are {[PRICE,DISTANCE]}"

Spits out:

Nearby where can we go {DISTANCE} for get a smashed
Nearby where can we go {DISTANCE} for get a drunk
Nearby where can we go {DISTANCE} for get a drink
Nearby where can we go {DISTANCE} for get a steaming
Nearby where can we go {DISTANCE} for a smashed
Nearby where can we go {DISTANCE} for a drunk
Nearby where can we go {DISTANCE} for a drink
Nearby where can we go {DISTANCE} for a steaming
Nearby where can we go {DISTANCE} for get smashed
Nearby where can we go {DISTANCE} for get drunk
Nearby where can we go {DISTANCE} for get drink
Nearby where can we go {DISTANCE} for get steaming
Nearby where can we go {DISTANCE} to get a smashed
Nearby where can we go {DISTANCE} to get a drunk
Nearby where can we go {DISTANCE} to get a drink
Nearby where can we go {DISTANCE} to get a steaming

HowMany what places can you find that are {DISTANCE}
HowMany what places are there that are {PRICE}
HowMany what places are there that are {DISTANCE}
HowMany what places is there that are {PRICE}
HowMany what places is there that are {DISTANCE}

Where {PRICE} and {DISTANCE} are slots passed into the skill.
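
The expansion itself only takes a few lines of Python. This is a rough sketch of the idea rather than the exact code in python-alexa, using itertools.product to multiply out the bracketed options:

import itertools
import re

def expand(template):
    # Split into bracketed option groups and the literal text between them
    parts = re.split(r'(\[[^\]]*\])', template)
    options = [p[1:-1].split(',') if p.startswith('[') else [p] for p in parts]
    for combo in itertools.product(*options):
        # Join and collapse the whitespace left behind by empty options
        yield " ".join("".join(combo).split())

for utterance in expand("where [can,should] [we,i] go [{DISTANCE},] [for,to] [get a,a,get] [smashed,drunk,drink]"):
    print("Nearby " + utterance)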

This little addition to Python-Alexa makes utterance generation super quick and you can cover loads of different variations. I hate doing manual things that can be easily scripted!

This output can be added to the Alexa Skill Interaction Model config and you're good to go.

Google Places API

I love this API. I managed to find a great Python module called googlemaps which provides a wrapper around the API and makes using it even easier.

The API is authenticated with an API key that you simply pass in when creating the client, and you are good to go. I utilised Lambda's support for environment variables here to keep the key out of my code while still being easily accessible.
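
Wiring that up is only a couple of lines. A minimal sketch - the environment variable name here is just an assumption about how you might configure it on the Lambda function:

import os

import googlemaps

# The key is read from a Lambda environment variable rather than hard-coded
client = googlemaps.Client(key=os.environ['GOOGLE_API_KEY'])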

Bar Finder has a little function I wrote that takes in a postcode and returns all the bars nearby. Check it out:

def find_places(client, postcode, distance=DISTANCE['CLOSE'], price=None):

    postcode = " ".join(postcode.split())
    try:
        geocode_results = client.geocode(postcode)[0]
        location = geocode_results['geometry']['location']
    except (IndexError, KeyError):
        print("Couldn't use this location")
        return None, None

    args = {
        'location': location,
        'radius': distance,
        'type': 'bar',
        # 'open_now': True,
    }

    if price:
        # Google Places price levels run from 0 (free) to 4 (very expensive)
        if price >= "3":
            args['min_price'] = "3"
            args['max_price'] = "4"
        else:
            args['max_price'] = price

    places = client.places_nearby(**args)['results']

    # Give places without a rating a sentinel value so they sort last
    for i, place in enumerate(places):
        if 'rating' not in place:
            places[i]['rating'] = -1

    return places, location

This returns a list of places that I can pick a random item from:

places, user_location = find_places(client, " ".join(location['postalCode'].split()), distance, price=price)

if places:
    place = Place(random.choice(places), client, get_city(client, location))

Keeping things simple, I was able to get the basic functionality I wanted working pretty quickly.


Certification

Once you think you are ready to show the world your skill, you need to submit it to Amazon for certification. This process is very repetitive: although the issues highlighted when they fail the skill are valid, the debug information they provide is very vague. Often you will submit something that works for you, it will fail for reason X, you fix that fault and submit again, and then you fail again for another issue that existed before you fixed X. This back and forth is a bit tedious, and I only hope Amazon don't get fed up with repeat submissions when only small fixes are made each time.

An example of this was images in response cards. However I tested it, the image I return with the picked bar would show up as expected. Yet when I submitted the skill with this feature working, it would fail with "Image not found". No context is given as to how they are testing (i.e. a real device, the service simulator or some other tool), so this was impossible to fix: even when I replayed the request they made (taken from the logs), the images worked for me as expected. Simple solution: remove the feature for now. I want to get those socks. Oh, and the Echo 😎.
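
For reference, the card in question was just the standard Alexa "Standard" card with its image URLs pointing at objects in S3, something along these lines (the bucket, key names and text below are made up for illustration):

place_name = "The Red Lion"             # example values; the real skill fills
distance_text = "a five minute walk"    # these in from the Google Places result
place_id = "red-lion"

card = {
    'type': 'Standard',
    'title': place_name,
    'text': "How about {}? It's {} away.".format(place_name, distance_text),
    'image': {
        # Publicly readable objects in an S3 bucket
        'smallImageUrl': 'https://s3.amazonaws.com/bar-finder-images/{}-small.png'.format(place_id),
        'largeImageUrl': 'https://s3.amazonaws.com/bar-finder-images/{}-large.png'.format(place_id),
    },
}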


Tips

  • Test everything. In many ways.
    • Try it from your Echo, Echosim, the service simulator, your friend's Echo and so on.
  • Test in different circumstances.
    • For location, try with location permissions enabled and then disabled. See what happens if you look for places near a rural location - do you handle this correctly?
  • Don't give up.
    • I nearly did, as I was getting bored of the back and forth - but I stuck with it and survived!
  • Think about privacy.
    • Something I overlooked: if you require user location, you need to submit a valid privacy policy with your skill. I hadn't done this before, so it was new to me - I ended up finding this, which I repurposed for my own use.


And after a lot of work, trial and error, and patience, I finally got my skill published.

You can enable it by asking Alexa to "Enable Bar Finder", or check it out here.

What skills are you working on? How are you finding the process, and what would you like to see added to the Alexa dev toolset? Let me know in the comments below.

 Neil
