With Tana’s meeting notetaker, you can capture everything said in your meetings - no bots needed. It’s the fastest, most seamless way to record conversations and build your knowledge base from meetings in Tana.
If you have back-to-back meetings all day, it can be a pain to take notes and follow up on all the action items. This is where Tana can help.
Tana now offers a fully integrated AI meeting notetaker as part of the desktop experience. You can use it to transcribe meetings, generate summaries, and link notes directly to your work - without needing to install a plugin or add a bot.
If you set up the Google Calendar integration, the Today page in Tana will be updated with all your meetings.
Tana Desktop will also send you a meeting notification when the meeting starts, so you can simply click to capture audio and get an auto-generated summary directly in Tana.
After the meeting you can go through suggested action items and tag the ones you want to flow to your Task dashboard.
Tana offers several features designed to make it easy to take meeting notes:
The Google Calendar integration:
Automatically imports your calendar events to your Today page.
Applies supertags to events classified as meetings, so they're ready for recording and note-taking.
Shows a daily agenda calendar with your meetings on the Today page.
Tana Desktop can send you meeting notifications.
Tana meeting notetaker:
Lets you record and transcribe your meetings in real time, directly from Tana Desktop - no bot required.
It captures system audio, meaning it records not just your voice but also everyone else’s - perfect for virtual calls, live streams, and shared audio. The transcription will try to separate speakers on a best-effort basis, but since it does not have access to each speaker's audio stream, it may not give perfect results when voices are similar or audio quality is poor.
This feature is ideal for situations where inviting a bot isn’t possible or preferred, and you still want to capture everything in one place.
The meeting agent (in-call bot):
If you are not able to join a meeting in person, or need to join from mobile, Tana offers a meeting agent that you can add to your calls.
It will show up in the meeting as a bot, provide a transcript of what each participant said, create a summary, and suggest action items.
This is recommended in situations where high accuracy in speaker detection is important.
Task management:
Tana offers a great way to take meeting notes and follow up on action items, since you can build a task management workflow that works for you.
Tag any action item from a meeting with #Task (or the task tag you use), and it will flow to your task dashboard.
Click the #Task tag to see all your tasks, and customize views that make it easy for you to stay on top of things.
The calendar integration and meeting transcription are two intertwined features designed to work together, enabling you to capture meeting notes with one click.
The calendar integration is not just a convenient way to see your events on the day node. It is an opportunity for AI to help build out your knowledge graph:
It syncs your meetings to Tana so you have a place to prepare for meetings, and to write notes and agenda points beforehand.
The Tana Desktop app can send you notifications about upcoming meetings, allowing you to transcribe meetings and get an auto-generated summary.
All notes are saved on the meeting, and you can use AI chat with the meeting notes to ask questions about what was said or extract information with custom prompts and commands.
This feature is ideal for situations where inviting a bot isn’t possible or preferred, and you still want to capture everything in one place.
No bots to manage: No external participants to add to your calls, just seamless recording.
Effortless: Start transcription with a click from your meeting notification.
Tool agnostic: Works with any tool or platform you’re using: Zoom, Meet, Teams, and beyond.
Cross-platform: Available on Mac and Windows with the Tana Desktop app.
Supports 61 languages: Tana auto-detects what’s being spoken and can dynamically switch between languages during the conversation. Full list of languages.
Tana captures everything you hear from system audio, including other people speaking and shared audio like videos.
It will capture audio from others, even if you’re wearing headphones.
See the transcript appear in real time as the meeting happens. Shift+click the "Transcript" button to open the live transcript in a panel next to the meeting notes.
Scroll back if you missed something that was said.
The transcription will try to separate speakers on a best-effort basis, but since it does not have access to each speaker's audio stream, it may not give perfect results when voices are similar or audio quality is poor.
Take your own notes in the main node if you want to. These will be included as context if you use AI chat on the node afterwards.
After the meeting: automatic summary and timestamped transcript
Stop the transcription when the meeting is done, and Tana will automatically create a summary and suggest action items.
Review and tag action items, and assign them to yourself or others.
Create references to the transcript or tag things directly in the transcript.
If you are an existing user who wants to set up live transcription on one or more existing supertags, see the full documentation here.
Tana also offers an in-call bot that you can add to scheduled calls. This requires more setup for each call than the Meeting notetaker, so it is only recommended for situations where:
You want high accuracy on speaker separation in the transcript - the meeting agent will identify speakers based on the audio streams in the call, while the Meeting notetaker will make a best effort based on differences in voice (but does not have access to separate audio streams).
You need to join the meeting on mobile and still want automatic meeting notes and a summary.
When the Meeting agent creates notes, it does the following:
Writes a concise, interconnected meeting summary: Your meeting summaries are no longer siloed on a different AI transcription platform just for meeting notes. They now live in Tana with the rest of your knowledge, connected to things in the transcript and in your graph.
Creates action items based on the conversation: It identifies tasks that come out of a meeting, so you can tag them with #Task with one click.
Improved: Better transcription model that supports the 60 most common languages and is very good at recognizing language switching and automatically choosing the correct language. Used both for live transcription in the app and for mobile capture. Unsupported languages in the app fall back to Whisper (if you set it with the Set default transcription language command).
Fixed: Fixed transcription formatting and improved accuracy for Chinese language input.
New: We now detect meetings in Telegram, WhatsApp, Signal, and FaceTime.
Improved: Added a notification if we don't detect speech within 30 seconds while the AI meeting notetaker is turned on.
Fixed: We no longer send notifications for meetings that are in the trash.
Improved: Scheduled meeting notifications now open meetings and start transcription automatically.
Improved: More refinements to bot-less meetings to make audio quality less flaky, stop crashing when using multiple monitors, and use much less CPU in some cases.
Improved: Updated the default prompt for the Text Processing Agent to achieve better summary results for most meetings. We recommend tweaking your summary prompt for better results for specific meeting types.
Improved: Meeting notifications will now stick around until at least two minutes into the meeting before being auto-dismissed.
Fixed: Fixed a bug where the audio command would override compact menu items when no commands were set in the full menu.
Info: Changed the default model for the Text Processing Agent to o4-mini.
Info: We will now extract action items with the text processing agent as long as there is an "Action item target". If you want them tagged as well, also configure "Tags to use for action items".
Improved: Some cleanup of the text processing agent parameters: removed some deprecated ones and reorganized the most important ones to the top. No change for already configured commands.
Fixed: Fixed an issue where applying a tag suggestion didn't properly initialize fields.
Fixed: Fixed an issue where switching languages during real-time transcription lost the transcription context.
Yes, Tana now offers a fully integrated bot-less AI meeting notetaker as part of the desktop experience. You can use it to transcribe meetings, generate summaries, and link notes directly to your work - without needing to install a plugin or add a bot.
How do I change the tags used in the meeting agent and calendar integration?
Dec 02, 2024
Supertags are created or assigned as part of the setup for your meeting agent/calendar integration. Here's a quick guide on how to change these tags post-setup.
Supertags that are used during calendar sync are listed under Classification of events in the Google Calendar settings:
Go to Account settings > Settings > Google Calendar settings > Classification of events
For more on what the different event types mean, see here.
When you change a tag, it needs to use the same fields that are set up for the calendar integration; otherwise, that information will be lost. To check which fields are used during sync, go to Google Calendar settings > Advanced > Raw configuration:
The raw configuration defines all the fields that Tana will sync specific calendar event information to.
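As a purely illustrative aid, here is a minimal sketch in Python of the kind of event-to-field mapping a raw configuration expresses. Every key, field name, and value below is an assumption made up for this example, not Tana's actual configuration format or schema.

```python
# Hypothetical sketch of a calendar-sync field mapping, for illustration only.
# The keys and field names are assumptions, not Tana's actual raw configuration.
calendar_sync_config = {
    "event_tag": "#meeting",        # supertag applied to events classified as meetings
    "field_mapping": {
        "summary": "Title",         # calendar event title -> Tana field
        "start": "Date",            # event start time -> Tana date field
        "attendees": "Attendees",   # guest list -> Tana people field
        "location": "Location",
    },
}

# If you swap the meeting supertag, the replacement tag must define the same
# fields (Date, Attendees, ...) or the synced information has nowhere to land.
for tana_field in calendar_sync_config["field_mapping"].values():
    print(f"Replacement tag must define the field: {tana_field}")
```

The point this illustrates is the one above: a replacement supertag must define the same fields that the sync writes to, or the synced information will be lost.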
Supertags are used when processing the transcript to create meeting notes. These supertags are set during setup and can be changed by going into the command node that controls the meeting agent and the transcription process.
First, you must navigate to the text processing agent command for your meetings. For instructions on how to find this, go here and scroll up: Meeting agent > Text processing agent command
You should now have the Text processing agent config open. Here you will see parameters like "Tags used for X", which list the supertags used when processing meetings.
To change the tags, you simply add tags to or replace tags in these lists.
However, there are several things to consider when you add or replace tags:
Study the existing tag: Open the config for the tag you want to replace, and study how it's set up. Try to identify dependencies like specific fields that are also used in the meeting config, and try to make sure your replacement tag covers as much of the original one as possible.
Set Base type: Make sure the new supertag you add has the appropriate Base type defined, and that it's suitable for the type of object it is:
Items typically use base tags: Project, Topic, Event
Entities typically use base tags: Person, Organization, Location
Action items typically use base tag Task
Decommission the old tag: Make sure the tag you're no longer using is merged with the one you're using, or deleted, to prevent clutter and confusion.
There are likely more things to consider; let us know if we missed any via Rate this article 👍👎.
Does the GPT log monitor show AI credits usage by the Meeting agent?
Sep 12, 2024
Only partially; the transcription of the meeting does not use OpenAI and is therefore not logged in the GPT log monitor.
Meetings use a service called Soniox, and we are working with them to expose the AI credit cost of individual meetings in a simple way, but this is not currently possible.
The post-processing of the meeting transcript is, however, done using OpenAI and will be visible in the GPT log monitor.
How can I revoke a meeting agent from a meeting I'm not hosting?
Sep 05, 2024
We are looking into creating a button that will make the agent leave a meeting if you press it. Until we have this in place, it is possible to force the agent to stop transcribing and kickstart the meeting notes processing if necessary.
To do this:
Run the command Debug node on the meeting node
Set the Meeting bot status to Done
The agent should now have stopped transcribing, and the text processing agent should start running.