Send an AI phone call with a custom objective and actions.
Formatting Examples
"You are {name}, a customer service agent at {company} calling {name} about {reason}.Unused Parameters
When a `pathway_id` is provided, the following parameters are unused:

- `task` - The pathway substitutes as the agent's instructions.
- `model` - We use our own fine-tuned models under the hood.
- `tools` - Replaced by the 'Webhook' Node in Pathways.
- `transfer_list` - Replaced by the 'Transfer Call' Node in Pathways.
- `transfer_phone_number` - Replaced by the 'Transfer Call' Node in Pathways.

Example Simple Request body:

voice: "maya"

Curated voices:

- maya
- mason
- ryan
- adriana
- tina
- matt
- evelyn

Moving from `voice_id` to `voice`
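As a minimal sketch of a pathway-driven request body (the `phone_number` key and the pathway ID value are assumptions for illustration; `pathway_id` and `voice` come from this doc):

```python
# Minimal request body for a pathway-driven call.
# ASSUMPTIONS: the "phone_number" field name and the pathway ID value
# are illustrative; they are not documented in this section.
payload = {
    "phone_number": "+12223334444",
    "pathway_id": "your-pathway-id",   # hypothetical ID
    "voice": "maya",                   # one of the curated voices
}

# task, model, tools, transfer_list, and transfer_phone_number are
# deliberately omitted: they are unused when pathway_id is provided.
```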
Using `voice_id` or `reduce_latency` in your request is still supported, but not recommended. The previous structure to select voices used both `voice_id` and `reduce_latency`. To simplify the process, we've combined these into a single `voice` parameter.

- If the first two characters of `voice` are `RL`, that is equivalent to setting `reduce_latency` to true. `HQ` will use the high-fidelity version of the voice.
- The number that follows is the same `voice_id` from before.
- `reduce_latency: true, voice_id: 0` is equivalent to `voice: "RL0"`
- `reduce_latency: false, voice_id: 3` is equivalent to `voice: "HQ3"`

Note: Including `reduce_latency` may override the `voice` parameter, so exclude it when using `voice`.

Moving from `voice_preset_id` to `voice`
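The mapping above is mechanical, so a small helper makes migration straightforward (a sketch of the documented equivalence, nothing more):

```python
def legacy_to_voice(reduce_latency: bool, voice_id: int) -> str:
    """Convert a legacy (reduce_latency, voice_id) pair into the
    combined `voice` string: RL = reduced latency, HQ = high fidelity."""
    prefix = "RL" if reduce_latency else "HQ"
    return f"{prefix}{voice_id}"

print(legacy_to_voice(True, 0))   # RL0
print(legacy_to_voice(False, 3))  # HQ3
```

Remember to drop `reduce_latency` from the request once you send `voice`, since it may override the new parameter.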
Voice presets are now part of the `voice` parameter and can use either the preset name or ID. If you used to have a `voice_preset_id` of "2f9fdbc7-4bf2-4792-8a18-21ce3c93978f", you can now use voice: "2f9fdbc7-4bf2-4792-8a18-21ce3c93978f".

Background track options:

- `null` - Default, will play audible but quiet phone static.
- `office` - Office-style soundscape. Includes faint typing, chatter, clicks, and other office sounds.
- `cafe` - Cafe-like soundscape. Includes faint talking, clinking, and other cafe sounds.
- `restaurant` - Similar to cafe, but more subtle.
- `none` - Minimizes background noise.

If `wait_for_greeting` is set to true, the agent will wait for the call recipient to speak first before responding.

When set to true, the AI will not respond to or process interruptions from
the user."0.9", "0.3", "0.5"array of objects that guides the agent on how to say specific words. This is great for situations with complicated terms or names.Object Parameters
- `word` — the word you want to guide the LLM on how to pronounce.
- `pronunciation` — how you want the LLM to pronounce the word.
- `case_sensitive` — whether or not to consider case. Particularly useful with names, e.g. 'Max' the name versus 'max' the word. Defaults to false. Not required.
- `spaced` — whether or not to consider spaces. When true, the word 'high' would be flagged but NOT 'hightop'. Defaults to true. Not required.

Prompting Notes
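Putting the object parameters above together, a pronunciation guide array might look like this (the surrounding `pronunciation_guide` field name and the phonetic spellings are illustrative assumptions):

```python
# Illustrative pronunciation guide entries. The list's field name
# (pronunciation_guide) and the phonetic spellings are assumptions;
# the per-object keys (word, pronunciation, case_sensitive, spaced)
# come from the docs above.
pronunciation_guide = [
    {
        "word": "Nginx",
        "pronunciation": "engine x",
    },
    {
        "word": "Max",                # the name, not the word 'max'
        "pronunciation": "Maks",
        "case_sensitive": True,       # defaults to false
    },
    {
        "word": "high",
        "pronunciation": "hi",
        "spaced": True,               # 'high' matches, 'hightop' does not
    },
]
```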
- In the `task`, refer to the action solely as "transfer" or "transferring".
- Overrides `transfer_phone_number` if a `transfer_list.default` is specified.
- Will default to `transfer_list.default`, or the chosen phone number.

Example usage to route calls to different departments:
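A sketch of what such a department-routing list might look like (only the `default` key is documented above; the department labels and phone numbers are made-up examples):

```python
# Illustrative transfer_list for routing calls to departments.
# ASSUMPTIONS: the department keys (sales, support, billing) and all
# phone numbers are invented for this example; only "default" is
# documented above.
transfer_list = {
    "default": "+12223334444",   # used when no department is chosen
    "sales": "+12223334445",
    "support": "+12223334446",
    "billing": "+12223334447",
}
```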
Supported languages:

- `en` - English
- `en-US` - English (US)
- `en-GB` - English (UK)
- `zh` - Chinese (Mandarin, Simplified)
- `zh-CN` - Chinese (Mandarin, Simplified, China)
- `zh-Hans` - Chinese (Mandarin, Simplified, Hans)
- `zh-TW` - Chinese (Mandarin, Traditional)
- `zh-Hant` - Chinese (Mandarin, Traditional, Hant)
- `es` - Spanish
- `fr` - French
- `de` - German
- `hi` - Hindi
- `ja` - Japanese
- `ko` - Korean
- `pt` - Portuguese
- `it` - Italian
- `nl` - Dutch
- `pl` - Polish
- `ru` - Russian
- `sv` - Swedish
- `da` - Danish
- `id` - Indonesian
- `ms` - Malay
- `tr` - Turkish

Information you want the agent to know can be added to `request_data`.

Properties
- `url`: The URL of the external API to fetch data from.
- `response_data`: An array of objects describing how to parse and use the data fetched from the API. Explained in more detail below.

The following are optional:

- `method`: Allows GET or POST. Default: GET.
- `cache`: Whether to fetch the data once at the beginning of the call, or to re-check continuously for data that might change mid-call. Default: true.
- `headers`: An object of headers to send with the request.
- `body`: The body of the request.

The following variables can be injected into the dynamic request body:

- `{{from}}` (Ex. +12223334444)
- `{{to}}`
- `{{short_from}}` (Ex. 2223334444)
- `{{short_to}}`
- `{{call_id}}`

These can be added to `dynamic_data[].body`, where they are replaced by system values in each request.

Each `response_data` object has the following properties:

- `name`: A label for the fetched data. Example: "Flight Status"
- `data`: The JSON path in the API response to extract the data from. Example: "user.flights[0].status"
- `context`: How this data should be incorporated into the AI's knowledge. Example: "John's flight is currently {{Flight Status}}"

Scheduled calls use the format `YYYY-MM-DD HH:MM:SS -HH:MM` (ex. 2021-01-01 12:00:00 -05:00). The timezone is optional, and defaults to UTC if not specified.

Note: Scheduled calls can be cancelled with the POST /v1/calls/:call_id/stop endpoint.

If `amd` is set to true or `voicemail_action` is set to `ignore`, then this will still work for voicemails, but it will not hang up for IVR systems.

Voicemail actions: `hangup`, `leave_message`, `ignore`.

- If `voicemail_message` is set, that message will be left and then the call will end.
- When `amd` is set to true, the AI will navigate the system and continue as normal.
- When `voicemail_action` is set to `ignore`, the AI will ignore the IVR and continue as normal. If `voicemail_message` is set, then it'll leave that message and end the call.
- If `voicemail_message` is set, then the AI will leave the message regardless of the `voicemail_action`.

Retry parameters:

- `wait` (integer): The delay in seconds before the call is retried.
- `voicemail_action` (enum): The action to take when the call goes to voicemail.
  Options: `hangup`, `leave_message`, `ignore`.
- `voicemail_message` (string): The message to leave when the call goes to voicemail.

Calls can last at most `max_duration` minutes. At the end of that timer, if the call is still active it will be automatically ended. Example Values: 20, 2

To record your phone call, set `record` to true. When your call completes,
you can access the recording through the `recording_url` field in the call details or
your webhook.

Webhook event options: `queue`, `call`, `latency`, `webhook`, `tool`, `dynamic_data`, `metadata`.

The summary is generated based on the transcript - you can use this field to add extra instructions and context for how it should be summarized. For example: "Summarize the call in French instead of English."

Structured information can be extracted from the call by defining an `analysis_schema`; the results are returned in `analysis`. For example, you may want to retrieve specific details mentioned during the call.

Enables the `answered_by` field with the value `human`, `unknown`, or `voicemail`.

Notes for accuracy:

- If `answered_by` is `voicemail` or `human`, that is nearly 100% accurate.
- If it is `unknown`, try using text analysis by adding `answered_by` to your `analysis_schema`.

The response `status` will be `success` or `error`. Some response fields are only returned when the status is `success`.
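As a hedged sketch of how an analysis schema might be shaped (the keys inside the schema are invented for illustration; only `analysis_schema`, `analysis`, and `answered_by` appear in this doc):

```python
# Illustrative analysis schema. The customer_name and appointment_time
# keys are made-up examples; including "answered_by" mirrors the
# accuracy tip above for calls that come back as "unknown".
analysis_schema = {
    "customer_name": "string",
    "appointment_time": "string",
    "answered_by": "string",
}

def extract_analysis(call_details: dict) -> dict:
    """Pull the structured results out of completed call details,
    assuming they are returned under an 'analysis' key as documented."""
    return call_details.get("analysis", {})
```

Defensive access via `.get` keeps the helper safe for calls that ended in `error`, where the `analysis` field may be absent.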