TL;DR: Most EVV demos show you the clean path - a caregiver clocks in, the visit completes, the record submits. What they don't show you is what happens when GPS doesn't match, a caregiver forgets to clock out, a reason code needs to be added, or the aggregator rejects a record. This checklist walks one real visit end-to-end - schedule through clock-in, documentation, exception handling, edit audit trail, aggregator submission, and billing-ready output - so you can pressure-test compliance readiness, caregiver adoption, and office workflow before you sign anything. Ankota's EVV is built into the same platform that handles scheduling, visit documentation, and billing, so the exceptions your office handles each morning are already in the same system where they get resolved - not imported from a separate tool.
Before getting into compliance mechanics and scoring categories, there's a single test that tells you more than any feature checklist. Ask the vendor to follow one visit through its entire lifecycle without switching screens or jumping to slides.
The sequence is: schedule and service plan created, caregiver clocks in using the primary method, visit location and service captured near real time, caregiver documents tasks and notes if your state requires it, caregiver clocks out, exceptions triggered and resolved, manual entry or edit with reason codes and attestation if needed, audit trail reviewed, submission to state aggregator or payer workflow where applicable, and billing-ready output with units, hours, and claims workflow visible.
If the vendor can't do that live and in sequence, walk away. The inability to demonstrate this end-to-end flow under demo conditions is a preview of what your coordinators will deal with every day. For a broader look at what separates strong EVV platforms from the rest, our EVV software comparison guide covers the evaluation criteria in detail.
The Demo That Looked Right and Wasn't:
A pattern that shows up consistently in the EVV market is software that was built around one agency's workflow or hardcoded for one aggregator's file format - and then never updated when the rules changed. The major aggregators in the market - HHA Exchange, Sandata, Tellus/Netsmart, Therap, and Authenticare - are sophisticated and demanding, and they evolve. A platform that was configured to produce "the Oregon flat file" in 2017 and hasn't been touched since is not the same as a platform that actively maintains aggregator connections as requirements shift.
The real test of a vendor's aggregator readiness isn't what they show you in the demo. It's what happens the next time your state announces new data elements or changes what it's checking for. Ask the vendor directly: how many aggregator updates did they push in the past 12 months, and what's their process when a state requirement changes mid-year? A vendor who can answer that specifically is doing very different work from one who points you to a general compliance statement on their website.
A surprising number of EVV demos gloss over compliance mechanics entirely. They show you the mobile app, the dashboard, and the reporting screen, and they assume that passes for a compliance review. It doesn't.
Multiple visit elements - date, start time, end time, recipient, worker, and service type - are expected to be captured near real time. The practical question is where the timestamp comes from and what happens when it's wrong.
Ask the vendor to show you the timestamp source: is it device time or server time? What happens if the phone clock is wrong? What's the exact workflow if a caregiver starts a visit late and clocks in 20 minutes after arrival? These aren't edge cases - they happen every week in home care operations, and a platform that handles them cleanly is structurally different from one that doesn't.
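One reasonable design - a sketch only, with illustrative names and a tolerance we've chosen arbitrarily - is to record both the device time and the server time at clock-in, treat the server time as authoritative, and flag the event for review when the device clock has drifted too far:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical tolerance; a real platform would make this configurable.
DRIFT_TOLERANCE = timedelta(minutes=5)

def capture_clock_in(device_time: datetime, server_time: datetime) -> dict:
    """Record both timestamps; server time is authoritative."""
    drift = abs(device_time - server_time)
    return {
        "device_time": device_time.isoformat(),
        "server_time": server_time.isoformat(),   # the timestamp of record
        "clock_drift_seconds": int(drift.total_seconds()),
        "needs_review": drift > DRIFT_TOLERANCE,  # phone clock looks wrong
    }

server = datetime(2024, 3, 4, 9, 0, tzinfo=timezone.utc)
ok = capture_clock_in(server + timedelta(seconds=40), server)    # minor drift
bad = capture_clock_in(server - timedelta(minutes=22), server)   # flagged
```

A vendor whose answer resembles this - dual capture with an explicit review path - has thought about the problem; a vendor who says "we just use the phone's clock" has not.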
Location capture at visit start and end is a compliance requirement, not a nice-to-have feature. The two standard models - Home and Community - each have different operational implications, and your state's rules may specify which is required and when.
Ask the vendor to show you how location is captured and stored, whether their system supports your state's required location model, and what caregivers see when they're at a location that doesn't match the expected address. That last scenario - a client who's temporarily at a family member's home, or a community-based visit at a location the system hasn't seen before - is where weak platforms create exception backlogs.
This is where most demos fail. Manual visit entry isn't a workaround - it's a required capability with specific compliance documentation attached. Any manual visit should require reason codes explaining why it wasn't captured electronically, an attestation confirming the visit occurred, and a complete audit trail showing who entered it, when, and why.
Ask the vendor to enter a manual visit live during the demo. Ask them to show you the reason code list - specifically, whether it meets your state's required taxonomy. Ask them to show the attestation step and where it's stored. Then ask them to show the audit trail for that entry. If any of those steps require the rep to check with someone or navigate to a screen they haven't shown before, that's meaningful information about how the system was designed.
The compliance requirement for visit edits mirrors the one for manual entry: reason codes, attestations, and full audit trails are required both before and after submission. There's an additional layer when direct care workers have the ability to edit their own visit records - approval controls must exist so that caregiver-initiated edits go through a supervisor before they're final.
Ask the vendor to show a caregiver requesting an edit and what the supervisor approval workflow looks like. Then ask to see what appears in the audit trail for that post-submission change. An immutable audit trail is what makes your EVV records defensible. If the system allows changes but doesn't preserve the version history of those changes in a permanent, unalterable log, it won't survive a rigorous audit.
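To make "immutable" concrete: the structural property to look for is an append-only edit log - every change is a new entry with who, when, why, and the supervisor sign-off, and nothing is ever overwritten. A minimal sketch (field names are illustrative, not any specific vendor's schema):

```python
import datetime

class VisitAuditLog:
    """Append-only edit history: no update or delete methods exist."""

    def __init__(self, visit_id: str):
        self.visit_id = visit_id
        self._entries = []  # only ever appended to

    def record_edit(self, editor: str, field: str, old, new,
                    reason_code: str, approved_by: str) -> None:
        self._entries.append({
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "editor": editor,
            "field": field,
            "old_value": old,
            "new_value": new,
            "reason_code": reason_code,   # from the state-required taxonomy
            "approved_by": approved_by,   # supervisor sign-off on caregiver edits
        })

    def history(self, field: str) -> list:
        """Full version history for one field - what an auditor would request."""
        return [e for e in self._entries if e["field"] == field]

log = VisitAuditLog("visit-001")
log.record_edit("caregiver_42", "clock_out", "17:00", "17:45",
                reason_code="MISSED_CLOCK_OUT", approved_by="supervisor_7")
```

If the vendor's system exposes an "edit visit" screen that silently replaces the old value instead of appending an entry like this, that's the design flaw to catch in the demo.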
The Audit That Tested Their Records:
What an auditor actually wants to see is one place with everything they need, navigable in the sequence they're checking. The baseline comes from the 21st Century Cures Act requirements: client name and client ID, caregiver name and caregiver state ID, service type with the relevant CPT code, exact date and time of arrival and departure, the method used for each separately, and the backup verification data - GPS coordinates, telephony records, or FOB codes depending on your model. That's the floor, not the ceiling.
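That baseline can be sketched as a record type - useful as a checklist when you ask the vendor to pull up a visit. The field names here are illustrative; your state or aggregator dictates exact formats:

```python
from dataclasses import dataclass

@dataclass
class EvvVisitRecord:
    """Cures Act floor for one visit; every field must be populated."""
    client_name: str
    client_id: str
    caregiver_name: str
    caregiver_state_id: str
    service_type: str
    cpt_code: str                # the relevant CPT code for the service
    arrival_time: str            # exact date and time of arrival
    departure_time: str          # exact date and time of departure
    arrival_method: str          # capture method at clock-in
    departure_method: str        # recorded separately at clock-out
    verification_data: dict      # GPS coords, telephony record, or FOB code

    def missing_elements(self) -> list:
        """Fields an auditor would flag as absent."""
        return [f for f, v in vars(self).items() if v in ("", None, {})]
```

Asking the vendor to surface each of these fields on one screen, for one real visit, is a five-minute test of audit readiness.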
Beyond that floor, auditors vary. Some want task documentation. Some want caregiver and client signatures. Some want canceled visit records and the reason codes attached to them. The agencies that get through audits quickly - we regularly hear from centers where the auditor wrapped everything in under an hour - are the ones whose EVV records are organized in one place with every required element surfaced without navigation or manual assembly. That outcome isn't an accident. It reflects a platform that was built with audit readiness as a design requirement, not an afterthought. The question worth asking in your demo is simple: show me what an auditor sees when they request a visit record, and show me where the edit history lives if they ask for it.
The most sophisticated compliance engine in the world fails if caregivers can't use the app reliably in the field. A demo that only shows admin screens is not a complete demo.
Have the vendor complete a full visit from the caregiver side with you timing how many taps it takes: open the app, find the client, start the visit, confirm the service or task, add a quick note if required, end the visit, and confirm with signature or attestation if your state requires it. If the demo flow is clunky or requires the rep to narrate around confusing steps, that's exactly what your caregivers will experience - and they won't have a rep narrating. For a practical guide on what good caregiver adoption looks like in EVV rollouts, our article on preparing caregivers to adopt EVV covers the change management side in detail.
Home care visits happen in basements, rural areas, and buildings with no signal. Any vendor that can't clearly explain their offline behavior is telling you they haven't thought seriously about real-world field conditions.
Ask the vendor to turn on airplane mode and complete a visit. Ask what data is stored locally on the device, what happens if the app crashes mid-visit, and how conflicts are resolved when the device syncs after connectivity is restored. An honest answer - even one that acknowledges limitations - is more useful than a polished non-answer. The answer you want is specific: the app captures the timestamp and location locally at clock-in and clock-out and syncs them once connectivity returns, not a vague assurance that the data is "stored."
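The pattern to listen for is an offline-first local queue: events are written to device storage immediately, timestamp and location included, and drained to the server only when connectivity returns. A minimal sketch with hypothetical names (a real app would persist to disk, not memory):

```python
import json
import time

class OfflineEventQueue:
    """Capture visit events locally; sync later. Illustrative only."""

    def __init__(self):
        self._pending = []

    def capture(self, event_type: str, lat: float, lon: float) -> dict:
        event = {
            "type": event_type,          # "clock_in" / "clock_out"
            "captured_at": time.time(),  # recorded locally, even with no signal
            "lat": lat,
            "lon": lon,
            "synced": False,
        }
        self._pending.append(event)
        return event

    def sync(self, send) -> int:
        """Drain the queue through `send`; keep failures for retry."""
        remaining, sent = [], 0
        for event in self._pending:
            try:
                send(json.dumps(event))
                event["synced"] = True
                sent += 1
            except OSError:
                remaining.append(event)  # still offline: retry next sync
        self._pending = remaining
        return sent
```

The important property is that `captured_at` is set at the moment of the tap, not at the moment of sync - that's what keeps an offline visit's timestamps defensible.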
Beyond the clean demo scenario, ask the vendor to walk through the situations that happen constantly in real operations: a caregiver who forgets to clock out, a caregiver who clocks in at the wrong client, an address that's inaccurate in the system, a service type that changes mid-visit, a last-minute caregiver swap, and a community-based visit at a location that isn't the client's home. Each of these creates an exception. The question is whether the system handles it with a defined workflow or whether it creates a problem your coordinator has to untangle manually.
Most EVV vendors will tell you their system "supports exceptions." What that usually means varies enormously. Some platforms have a genuine exceptions workflow with dashboards, routing, SLA tracking, and audit evidence attached to each resolution. Others surface exceptions as a list that your team sorts through manually every morning.
The operational reality is that exceptions will happen every single day in a home care operation at any scale. Missing clock-outs, unauthorized services, unknown recipients, visits without in-call or out-call records - these aren't rare events. They're the daily work of exception resolution, and the cost of that work adds up in coordinator time and delayed billing.
Ask the vendor to show you a realistic exceptions scenario during the demo. "Pretend it's Monday morning and we have 30 exceptions. Show us exactly how we clear them." What you're looking for is a daily exceptions dashboard with filters by coordinator, supervisor, or team; clear "next action" guidance on whether the caregiver or the office resolves each type; time-to-resolution tracking so exceptions don't age unnoticed; and audit evidence attached to each resolution. If the vendor shows you a flat list with no routing logic, that's your answer.
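What routing logic means in practice is a mapping from exception type to resolver role plus a concrete next action, so the morning queue arrives pre-sorted instead of as one flat list. A sketch under assumed, illustrative type names:

```python
# Hypothetical routing table: exception type -> (who resolves it, next action).
ROUTING = {
    "missing_clock_out":    ("caregiver",  "Confirm actual departure time"),
    "gps_mismatch":         ("office",     "Verify service location with client"),
    "unknown_recipient":    ("office",     "Match visit to correct client record"),
    "unauthorized_service": ("supervisor", "Review against the service plan"),
}

def triage(exceptions: list) -> dict:
    """Group the morning exception queue by resolver role."""
    queues = {}
    for exc in exceptions:
        role, next_action = ROUTING.get(exc["type"], ("office", "Manual review"))
        queues.setdefault(role, []).append({**exc, "next_action": next_action})
    return queues

monday = [
    {"visit_id": "v1", "type": "missing_clock_out", "age_hours": 14},
    {"visit_id": "v2", "type": "gps_mismatch", "age_hours": 2},
    {"visit_id": "v3", "type": "missing_clock_out", "age_hours": 38},
]
queues = triage(monday)
```

The `age_hours` field is there deliberately: time-to-resolution tracking only works if every exception carries its age, so aging items can be escalated rather than buried.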
The Exception Backlog That Was Drowning the Team:
One of the more difficult situations we encounter is the agency that didn't keep pace with EVV compliance, often because they assumed their vendor was managing it on their behalf. When those calls come in, the ask is usually some version of: "Can we get fully live this week, and can you help us clean up the last two months of visits?"
The honest answer is that recovery is possible, but it comes with a real cost. Visits that weren't captured electronically in the required timeframe get reported as manual, non-EVV visits. That's not a good look with a state auditor, and there's no retroactive fix that changes the record. What does work is moving from that baseline to full compliance quickly and cleanly, so that the trend line the state sees is one of rapid correction rather than ongoing non-compliance. The agencies that recover best are the ones that treat the cleanup as a defined project with a clear end date, not an ongoing background task. That's exactly the scenario where your exception workflow - and your vendor's willingness to help you work through it - gets tested in ways that the demo never will.
A verified visit is not a paid visit. The gap between EVV capture and a clean, billing-ready claim is where a lot of agencies lose money and time, and it's consistently under-tested in demos.
The questions to ask are practical: What export format does the system produce, and does it match what your billing team actually uses? How are units calculated and rounded, and can you show us the logic? Where does the system show the reconciliation between scheduled visits, delivered visits, and billed visits? And for visits with edits or exceptions in their history - the "risky" ones - is that documentation attached to the claim export?
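To make the unit-calculation question concrete, here's one common convention as a sketch: 15-minute billing units, with a partial unit counted only when at least 8 minutes of it were delivered. Rounding rules vary by payer and state - treat both constants below as assumptions to verify against your contracts, and ask the vendor to show where their equivalents are configured:

```python
# Illustrative rounding convention only - confirm against your payer rules.
UNIT_MINUTES = 15
PARTIAL_THRESHOLD = 8  # minutes of a partial unit needed to bill it

def billable_units(visit_minutes: int) -> int:
    """Convert verified visit duration to billable units."""
    full, remainder = divmod(visit_minutes, UNIT_MINUTES)
    return full + (1 if remainder >= PARTIAL_THRESHOLD else 0)
```

A 53-minute visit yields four units under this rule while a 52-minute visit yields three - exactly the kind of one-minute boundary where a platform's rounding logic either matches your payer's rule or quietly generates denials.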
Ask the vendor to show you the billing export file live, and ask them to show you how the system identifies visits that are likely to be rejected before they're submitted. First-pass acceptance rates are a measurable outcome of EVV implementation - a platform that helps you improve them is doing different work than one that just captures visit data.
EVV training is often a documented requirement, not just a vendor service. Agencies typically need proof of training completion, and in consumer-directed models the responsibility split between the employer of record, the fiscal management services agency, and the employee adds another layer of complexity.
Before signing anything, ask the vendor for a training plan by role - caregiver, coordinator, supervisor, and billing - not a single onboarding session that combines everyone. Ask to see the training artifacts: quick-start guides, videos, quizzes, whatever they provide. Ask how training completion is tracked and how you produce documentation of it for compliance purposes. And if you operate any self-direction or FMS programs alongside traditional home care, ask specifically how responsibility for training changes in that model.
The risk of a vendor demo is leaving with a good feeling that doesn't survive contact with implementation. A structured scoring approach forces specificity and makes it easier to compare vendors after multiple demos.
We suggest scoring across seven categories: compliance and audit trail quality covering reason codes, attestations, and edit history; caregiver usability covering speed, clarity, and offline reliability; exceptions workflow covering dashboards, routing logic, and resolution tracking; aggregator and payer readiness where relevant to your state model; billing readiness covering exports, reconciliation, and denial prevention; training and change management covering materials, role-based delivery, and completion tracking; and implementation and support covering go-live timeline, SLA commitments, and what happens when something goes wrong on a Friday afternoon.
Rate each category on a consistent scale, require your team to document the specific evidence that drove each score, and use the same scorecard across every vendor you evaluate. Good vibes are not a procurement decision.
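The scorecard discipline above can even be enforced mechanically - refuse to total a vendor's score if any category is unscored or any score lacks an evidence note. A sketch with illustrative category keys and a 1-to-5 scale:

```python
CATEGORIES = [
    "compliance_audit_trail", "caregiver_usability", "exceptions_workflow",
    "aggregator_readiness", "billing_readiness", "training",
    "implementation_support",
]

def score_vendor(name: str, scores: dict) -> dict:
    """scores maps category -> (points, evidence note). Both are required."""
    missing = [c for c in CATEGORIES if c not in scores]
    if missing:
        raise ValueError(f"{name}: unscored categories {missing}")
    for cat, (points, evidence) in scores.items():
        if not evidence:
            raise ValueError(f"{name}: score for {cat} has no evidence note")
    total = sum(points for points, _ in scores.values())
    return {"vendor": name, "total": total,
            "average": round(total / len(CATEGORIES), 2)}
```

Whether you implement this in a spreadsheet or code, the point is the same: a score without documented evidence is just a vibe with a number attached.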
Ankota is built for home and community-based care organizations managing real operational complexity - scheduling, EVV, compliance, and Medicaid billing workflows that need to stay connected rather than operating as separate systems that someone reconciles manually.
The workflow continuity that matters most in an EVV context is the connection between visit verification and everything downstream. A visit that's verified should flow cleanly into exception resolution, supervisor approval, audit documentation, and billing-ready output without creating a manual reconciliation project at each handoff. When those steps are disconnected - when EVV lives in one system and billing lives in another and someone exports data between them - the exception workload compounds at every junction.
In an Ankota demo, the scenario worth seeing is a low-signal or forgotten clock-out: the caregiver works in an area with spotty coverage and doesn't clock out on time. The system flags the exception, captures the reason code and attestation, routes it to supervisor approval, preserves a complete audit trail, and does all of that without holding up billing. That's the workflow continuity that makes EVV an operational asset rather than a compliance burden.
If you're evaluating EVV platforms, request a workflow-first Ankota demo tailored to your state model and agency operations. We'll run through the end-to-end visit sequence and provide a demo scorecard you can reuse across every vendor you're comparing.
What is an EVV demo checklist and why do I need one? An EVV demo checklist is a structured set of scenarios and questions that forces a vendor to demonstrate compliance mechanics, caregiver usability, exceptions handling, and billing readiness live - rather than in a scripted product tour. Without one, demos tend to show you the best-case scenario. A checklist tests what happens when things go wrong, which is where platforms diverge most in practice.
What compliance requirements should I verify in an EVV demo? At minimum, verify that the system captures all required data elements near real time, supports your state's location model, requires reason codes and attestations for manual visits and edits, maintains an immutable audit trail for all changes, and routes caregiver edit requests through supervisor approval before finalizing. These aren't optional features - they're the mechanics that determine whether your EVV records are defensible in an audit.
How should I test caregiver usability during an EVV demo? Ask the vendor to complete a visit from the caregiver's perspective with you counting the taps. Then ask them to demonstrate offline mode by turning on airplane mode and completing a visit. Then walk through the edge cases caregivers encounter weekly: missed clock-outs, wrong client clock-ins, community-based visits, and last-minute caregiver swaps. If any of those scenarios require workarounds or the rep needs to improvise, that's what your caregivers will experience in the field.
What questions should I ask about EVV exceptions handling? Ask the vendor to show you a realistic morning exceptions scenario - 20 or 30 exceptions sitting in the queue - and walk through exactly how they're cleared. You're looking for role-based routing so the right person resolves each type, "next action" guidance at the exception level, time-to-resolution tracking, and audit evidence attached to each resolution. A flat list with no routing logic is a coordinator time sink.
How do I evaluate EVV billing readiness in a demo? Ask the vendor to show you the actual export file your billing team would use, the unit calculation and rounding logic, and the reconciliation between scheduled, delivered, and billed visits. Ask specifically how the system flags visits with edits or exceptions in their history - the ones most likely to generate denials - before they're submitted. First-pass acceptance rate is the metric that matters here.
What should I ask about EVV training and implementation support? Ask for a training plan by role rather than a single onboarding session, and ask to see the actual training materials rather than a description of them. Ask how completion is tracked and documented for compliance purposes. For implementation, ask for a specific go-live timeline, what the parallel-run period looks like, and what the support response model is for urgent issues after go-live - especially for problems that surface on a Friday payroll run.
Ankota's mission is to enable the Heroes who keep older and disabled people living at home to focus on care because we take care of the tech. If you need software for home care, EVV, I/DD Services, Self-Direction FMS, Adult Day Care centers, or Caregiver Recruiting, please Contact Ankota.
And if you're ready to see how the most innovative agencies are using AI to empower their caregivers and automate the rest, meet your new companion at www.kota.care.