_This is not an official Google product._

# ML API next talk demos

This repo includes 4 demos from my Google Next talk and Google I/O talk on the Cloud ML APIs. To run the demos, follow the instructions below.
## Vision API + Firebase demo

- `cd` into `vision-api-firebase/`
- Create a project in the Firebase console and install the Firebase CLI
- Run `firebase login` via the CLI and then `firebase init functions` to initialize the Firebase SDK for Cloud Functions. When prompted, don't overwrite `functions/package.json` or `functions/index.js`.
- In your Cloud console for the same project, enable the Vision API
- Generate a service account for your project by navigating to the "Project settings" tab in your Firebase console and then selecting "Service Accounts". Click "Generate New Private Key" and save the file to your `functions/` directory as `keyfile.json`
- In `functions/index.js`, replace both instances of `your-firebase-project-id` with the ID of your Firebase project
- Deploy your Cloud Function by running `firebase deploy --only functions`
- From the Authentication tab in your Firebase console, enable Twitter authentication (you can use whichever auth provider you'd like; I chose Twitter)
- Run the frontend locally by running `firebase serve` from the `vision-api-firebase/` directory of this project. Navigate to `localhost:5000` to try uploading a photo. After uploading a photo, check your Functions logs and then your Firebase Database to confirm the function executed correctly.
- Deploy the frontend by running `firebase deploy --only hosting`. For future deploys you can run `firebase deploy` to deploy Functions and Hosting simultaneously.
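For reference, the Vision API's v1 annotate response nests labels under `responses[].labelAnnotations`. A small helper for pulling out high-confidence label descriptions might look like the sketch below (the field names follow the public Vision API response format; this is illustrative, not the exact code in `functions/index.js`):

```javascript
// Extract label descriptions above a confidence threshold from a
// Vision API v1 annotate response (labelDetection feature).
function topLabels(apiResponse, minScore) {
  const annotations = (apiResponse.responses || [])
    .flatMap((r) => r.labelAnnotations || []);
  return annotations
    .filter((a) => a.score >= minScore)
    .map((a) => a.description);
}

// Example response shaped like the API's labelDetection output:
const sample = {
  responses: [{
    labelAnnotations: [
      { description: 'dog', score: 0.98 },
      { description: 'snout', score: 0.62 },
    ],
  }],
};

console.log(topLabels(sample, 0.9)); // [ 'dog' ]
```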
## Speech API Bash demo

- `cd` into `speech/`
- Make sure you have SoX installed. On a Mac: `brew install sox --with-flac`
- Create a project in the Cloud console and generate a new API key. Add your API key to `request.sh`
- Run the script: `bash request.sh`
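The script POSTs a JSON body to the Speech API. For orientation, a v1 recognize request has roughly this shape — the encoding and sample rate below are illustrative, and the actual values used by the demo live in `request.sh`:

```javascript
// Build the JSON body for a Speech API v1 recognize request.
// The config values here are illustrative defaults, not the
// demo's actual settings.
function buildSpeechRequest(base64Audio) {
  return {
    config: {
      encoding: 'FLAC',
      sampleRateHertz: 16000,
      languageCode: 'en-US',
    },
    audio: { content: base64Audio },
  };
}

// 'BASE64_AUDIO' stands in for base64-encoded audio bytes.
const body = buildSpeechRequest('BASE64_AUDIO');
console.log(JSON.stringify(body, null, 2));
```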
## Natural Language API BigQuery demo

- `cd` into `natural-language/`
- Generate Twitter Streaming API credentials and copy them to `local.json`
- Create a Google Cloud project, generate a JSON keyfile, and add the filepath to `local.json`
- Create a BigQuery dataset and table with the below schema, and add them to `local.json`
- Generate an API key and add it to `local.json`
- Change line 37 of `twitter.js` to filter tweets on whichever terms you'd like
- Install node modules: `npm install`
- Run the script: `node twitter.js`
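The script sends each tweet's text to the Natural Language API. For reference, the `documents:analyzeSentiment` request body in the public v1 REST API has this shape (a sketch, independent of the exact code in `twitter.js`):

```javascript
// Build the request body for the Natural Language API's
// documents:analyzeSentiment method (v1 REST shape).
function buildSentimentRequest(text) {
  return {
    document: { type: 'PLAIN_TEXT', content: text },
    encodingType: 'UTF8',
  };
}

const body = buildSentimentRequest('I love this talk!');
console.log(body.document.content); // I love this talk!
```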
## Natural Language API + Firebase realtime Twitter dashboard demo

- `cd` into `nl-firebase-twitter/`
- Create a project in the Firebase console and install the Firebase CLI
- `cd` into the `frontend/` directory and run `firebase login` and `firebase init` to associate this with the Firebase project you just created. When prompted, don't overwrite existing files. Create a database and hosting project (no Functions).
- In your Firebase console, click "Add Firebase to your web app". Copy the credentials to the top of the `main.js` file
- `cd` into the `backend/` directory and run `npm install` to install dependencies
- Generate a service account for your project by navigating to the "Project settings" tab in your Firebase console and then selecting "Service Accounts". Click "Generate New Private Key" and save this in your `backend/` directory as `keyfile.json`
- Generate Twitter Streaming API credentials and copy them to `backend/local.json`
- Navigate to the Cloud console for your project. Enable the Natural Language API and generate an API key. Replace `YOUR-API-KEY` in `backend/local.json` with this key.
- Replace `searchTerms` in `backend/index.js` with the search terms you'd like to filter tweets on
- Replace `FIREBASE-PROJECT-ID` in `backend/local.json` with the ID of your Firebase project
- Set up BigQuery: in your Cloud console for the same project, create a BigQuery dataset. Then create a table in that dataset. When creating the table, click "Edit as text" and paste the following:

```
id:STRING,text:STRING,user:STRING,user_time_zone:STRING,user_followers_count:INTEGER,hashtags:STRING,tokens:STRING,score:STRING,magnitude:STRING,entities:STRING
```

- Add your BigQuery dataset and table names to `backend/local.json`
- Run the server: from the `backend/` directory, run `node index.js`. You should see tweet data being written to your Firebase database
- In a separate terminal process, run the frontend: from the `frontend/` directory, run `firebase serve`
- Deploy your frontend: from the `frontend/` directory, run `firebase deploy`
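The backend flattens each annotated tweet into a row whose keys match the BigQuery schema above. A sketch of that mapping (the row keys come from the schema string; the tweet and Natural Language response fields shown are illustrative, not necessarily what `backend/index.js` does):

```javascript
// Map a tweet plus its Natural Language API annotations to a flat
// object whose keys match the BigQuery schema pasted above.
function toBigQueryRow(tweet, nl) {
  return {
    id: tweet.id_str,
    text: tweet.text,
    user: tweet.user.screen_name,
    user_time_zone: tweet.user.time_zone,
    user_followers_count: tweet.user.followers_count,
    hashtags: JSON.stringify(tweet.entities.hashtags),
    tokens: JSON.stringify(nl.tokens),
    score: String(nl.documentSentiment.score),
    magnitude: String(nl.documentSentiment.magnitude),
    entities: JSON.stringify(nl.entities),
  };
}

// Minimal illustrative inputs:
const tweet = {
  id_str: '123',
  text: 'hello world',
  user: { screen_name: 'sara', time_zone: 'PST', followers_count: 42 },
  entities: { hashtags: [] },
};
const nl = {
  tokens: [],
  documentSentiment: { score: 0.8, magnitude: 0.8 },
  entities: [],
};

console.log(toBigQueryRow(tweet, nl).score); // '0.8'
```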
## Multiple API demo

- `cd` into `vision-speech-nl-translate/`
- Make sure you've set your `GOOGLE_APPLICATION_CREDENTIALS` environment variable to point at a keyfile for a Cloud project that has the Vision, Speech, Natural Language, and Translation APIs enabled
- Run the script: `python textify.py`
- Note: if you're running it with image OCR, copy an image file to your local directory
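The client libraries read `GOOGLE_APPLICATION_CREDENTIALS` from the environment, so a missing variable usually surfaces as an opaque auth error. A tiny fail-fast guard (a sketch — none of the demo scripts necessarily include this) could be:

```javascript
// Fail fast with a clear message if application default
// credentials aren't configured in the given environment.
function credentialsPath(env) {
  const path = env.GOOGLE_APPLICATION_CREDENTIALS;
  if (!path) {
    throw new Error(
      'Set GOOGLE_APPLICATION_CREDENTIALS to the path of your ' +
      'service account keyfile before running the demo');
  }
  return path;
}

console.log(credentialsPath({ GOOGLE_APPLICATION_CREDENTIALS: '/tmp/key.json' }));
// /tmp/key.json
```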