So first of all, what do you do at Trackingplan?

I work in Customer Success, managing the relationship with clients to help them get value from the product in their day-to-day work. I also try to detect needs and risks with them before they turn into friction points.

Normally, this includes:

  • Onboarding and training (sessions, best practices, and guidance to ensure configuration and usage are consistent).
  • Driving adoption and results.
  • Proactive account management (anticipating tracking changes, new needs, and potential friction, and proposing next steps).

What’s a day in the life of Ángel like?

Basically, I start the day by reviewing the latest client installations and configurations so I can catch incidents as early as possible and help resolve them. I also check the dashboards that we know tend to concentrate a high volume of incidents, to assess their severity and prioritize.

If the incidents are minor (for example, specification changes or small configuration adjustments), I handle them directly to improve the client’s data quality. The goal of this whole process is for the client to receive only alerts that are truly relevant for their day-to-day, reducing “noise” and making it easier for them to act quickly when it really matters.

You spent years working as a digital analyst. How does that background influence the way you work with customers today?

Coming from an ecosystem very similar to my clients’ and having that dual perspective, technical and marketing, makes a clear difference in my work. It allows me to put myself in their shoes quickly, understand the real context they operate in (pressure for results, technical dependencies, campaign timelines, implementation constraints), and detect needs much faster. That understanding speeds up decision-making and makes working together more efficient to reach their goals.

A key part of my contribution is acting as a “bridge” between profiles that often only partially understand each other. I usually say I’m able to talk to a technical team and convey precisely what the marketing team needs (the “why,” business impact, priorities, and timing). At the same time, I can translate the technical language to a marketing team: clearly explaining what can be improved, what needs adjusting, and what the current limitations are, without losing rigor but avoiding unnecessary jargon.

Thanks to that translation ability in both directions, I reduce friction and misunderstandings, align expectations from the start, and help the client move faster: priorities are clearer, execution requires fewer iterations, and data quality improves with concrete actions, not just diagnoses.

I’m sure you often find yourself in situations with clients that feel very familiar. Do you think your background helps you empathize more and connect faster when you recognize a problem because you’ve been there yourself?

Absolutely. Having gone through similar problems helps me empathize faster and connect sooner because I immediately recognize both the technical pattern and the “weight” it usually carries for the team.

In practice, this translates into two very concrete advantages:

  • I can name the problem earlier: When a client describes somewhat vague symptoms (“the events are weird,” “the conversions don’t add up,” “suddenly there are duplicates”), having experienced similar cases allows me to turn that description into reasonable hypotheses and organize the chaos: what is most likely, what I would discard first, and which signals I would look for. This reduces friction and speeds up diagnosis.
  • I ask better questions from minute one: Instead of staying generic, I go straight to the questions that usually unblock the situation: “Did anything change in the container or CMP?”, “Did a second trigger fire?”, “Are there differences between staging and production?”, “What changed since the day it started?”. The client notices that you’re not just listening; you understand how this breaks in the real world.

There’s also an important human component: when you’ve been there, it’s easier to convey that sense of “I get it.” It’s not just that you know how to solve it, it’s that you validate the situation, which usually builds trust faster. That trust makes the client share critical context sooner, collaborate better (logs, access, recent changes), and be more willing to follow an action plan, which improves cooperation during resolution.

With all this background, what’s your approach to solving client challenges?

I try to solve client challenges through proactive management and leverage my technical background to act as a bridge between teams. If I had to summarize it in three points:

  • Act as a “bridge” and translator: Thanks to my previous experience, I have a dual technical and marketing perspective. This allows me to translate business needs to the technical team (explaining impact and priorities) and, conversely, explain technical limitations or adjustments to the marketing team without using unnecessary jargon.
  • Fast diagnosis based on experience (“having been there”): Having worked in their ecosystem, I understand the pressure and timelines they manage. I can name problems early (vague symptoms like “conversions don’t add up” can be turned into reasonable hypotheses because I’ve seen those patterns), and asking the right, specific technical questions (changes in the TMS, environment differences, etc.) usually unblocks the solution.
  • Proactivity and noise reduction: I don’t wait for the client to report a failure. I start my day reviewing installations to detect incidents before they become serious problems. If I detect minor incidents, I handle them myself to ensure data quality. My goal is to reduce “noise” so that the client only receives alerts that are truly relevant and critical to their business.

In reality, I try to use technical empathy to validate the client’s situation, build trust quickly, and ensure we work on real problems efficiently.

What lessons from past experiences guide your work today, shaping the way you try to deliver top-quality support and care to your customers at Trackingplan?

I try to be, above all, honest with the client: “yes to everything” is not a valid answer. Many clients want to hear an immediate yes, but from experience I know that not setting technical boundaries from the start ends up creating friction and discussions when they ask for things the tool can’t realistically solve (now or in the future).

That’s why my goal is to manage expectations from minute one: clearly explaining what can be done, what cannot, and why. That said, I try to make that “no” come with a “yes, and…”: “it can’t be done this way, but we can do it like this,” proposing concrete alternatives (workarounds, focus changes, or reprioritization) so the client isn’t left stuck.

I’ve also learned not to promise on the spot. I prefer to say “let me validate it” and come back with a solid answer after gathering context, checking assumptions, and, if needed, reproducing the case. This avoids unrealistic expectations and protects trust.

When something isn’t viable, I like to close with a clear commitment on the next step: what I’m going to review, what options I’ll propose (usually 2, with pros/cons), what risks or trade-offs exist (time, stability, data quality), and the timeframe in which I’ll share it. And if it requires product or engineering, I escalate responsibly with a clear brief (impact, urgency, and evidence), being transparent with the client about probabilities and timing.

In short, I don’t want to be just the “friendly” face that always says yes; I prefer to be the person who tells the truth, proposes a viable path, and protects the long-term relationship with clarity and transparency.

What’s your favorite part of building relationships with clients?

What motivates me the most is earning trust with clarity and empathy, and using that trust to execute faster and with less friction.

For me, the best part of building relationships with clients is when it stops being “support” and becomes real collaboration, where both sides feel they are moving in the same direction.

I especially like the moment when the client goes from telling you a loose symptom to sharing the real context: what concerns them, what deadlines they have, what internal dependencies exist, and what business impact is behind it. When you reach that level of trust, you can help much better: you ask the right questions from the start, align expectations honestly, and put a clear plan on the table.

At Trackingplan, I enjoy the pace: you can turn conversations into decisions quickly. I love being the “bridge” between teams (marketing, data, product, engineering), translating priorities without jargon, and looking for quick wins that build credibility. Those small accumulated improvements are what end up building a solid long-term relationship.

Can you share a memorable client win or interaction—perhaps a fun, surprising, or particularly impactful moment that has stuck with you?

One interaction that’s really stayed with me started with a pretty tense message from a client: “Legal is asking questions—are we 100% sure our site isn’t tracking people before they consent?”

Instead of going into a vague “we’ll investigate” loop, I asked for three super practical things: what CMP they were using and what changed recently (banner config, vendor list, Consent Mode setup), whether it was happening everywhere or only in certain regions/environments, and the exact date/time they first noticed it. From there, it got clear fast: they had a consent signal mismatch, so some tags were effectively behaving like “sure, go ahead” before the user had actually made a choice, while other tags were being blocked even after people opted in.

The fix wasn’t magic, it was just getting the basics in the right order: make sure there’s a proper default consent state set early (usually “denied” until the user interacts), then only “update” that state after the user actually takes an action in the banner. That sequencing is exactly what Consent Mode is designed for, and GTM even has a Consent Initialization trigger meant to run before other tags so you can avoid that “tags firing too early” situation.
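
To make that order of operations concrete, here’s a minimal sketch of the idea, assuming the standard gtag.js Consent Mode API is on the page; the consent types shown and the onBannerChoice callback are illustrative placeholders, not that client’s actual setup:

    // Assumes the standard gtag.js snippet has already defined `gtag` on the page.
    declare function gtag(
      command: 'consent',
      action: 'default' | 'update',
      params: Record<string, string | number>
    ): void;

    // 1) Set a safe default as early as possible, before any other tags run.
    //    In GTM, the Consent Initialization trigger exists precisely for this step.
    gtag('consent', 'default', {
      ad_storage: 'denied',
      analytics_storage: 'denied',
      wait_for_update: 500, // give the CMP a moment to restore a previously stored choice
    });

    // 2) Only update the state once the user actually acts in the banner.
    //    `onBannerChoice` is a hypothetical hook exposed by the CMP.
    function onBannerChoice(choice: { ads: boolean; analytics: boolean }): void {
      gtag('consent', 'update', {
        ad_storage: choice.ads ? 'granted' : 'denied',
        analytics_storage: choice.analytics ? 'granted' : 'denied',
      });
    }

The point of the sketch is the sequencing: a “denied” default that runs first, and an “update” that only happens after a real user action, so no tag ever sees a “granted” state it wasn’t given.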

What made it a win wasn’t just “we fixed it,” but the shift in vibe: we went from a stressful, blamey conversation to a calm, testable checklist—open the site, walk through a few journeys, confirm the consent state changes, confirm tags respect it, done. And the client basically said, “Thank you—this is the first time this has felt under control.”

Outside work, what hobbies or side projects help you recharge?

Outside work, I’m quite simple: I like spending time with my family and friends, and I’m very much into calm plans and being with my people. I’m the kind who prefers consistency over showiness, and that also helps me disconnect and recharge, because at a startup you normally move fast and juggle many conversations in parallel.

And then I have a habit that I keep quite constant: reading. I alternate fiction with non-fiction, and in non-fiction, I usually read things related to my work (analytics, tracking, product, communication), because I like learning and applying ideas day-to-day. In the end, I try to focus my hobbies on activities that bring me balance and curiosity, without overcomplicating things.

How do you think your work as a digital analyst would have been different if you’d had Trackingplan back then? Does it feel rewarding to know that tools like this are helping bring more recognition and value to a profession that’s often overlooked?

If I had had Trackingplan when I worked as a digital analyst, my day-to-day would have been quite different: less putting out fires after the fact and more proactive control. Instead of finding out about problems late, when a campaign had already launched or the reporting didn’t add up, I could have detected tracking changes earlier, prioritized what really impacts the business, and reduced noise from minor incidents so the team could focus on what matters.

Internal collaboration would also have changed a lot. A big part of analytics work is acting as a bridge between marketing and technical teams, and a tool like this helps transform vague complaints like “conversions don’t add up” into concrete hypotheses and clear actions. That reduces iterations, aligns expectations, and accelerates decision-making and execution.

And yes, it’s very rewarding. Analytics is usually “invisible” when it works and only noticed when something breaks, so it’s often undervalued. Knowing that tools like this improve data quality, make it easier to demonstrate impact, and turn analytics into something more operational and reliable helps bring more recognition and value to a profession that usually goes unnoticed.