How Finance Pros Can Fix Broken Reporting and End the Monday Morning Problem with Ian Wong
In this episode of Future Finance, hosts Paul Barnhurst and Glenn Hopper sit down with Ian Wong, co-founder and CEO of Summation, to talk about one of the most frustrating challenges in finance and analytics: getting timely, trustworthy answers to basic business questions. Ian shares the story behind what he calls the “Monday Morning Problem” and explains why finance teams often spend weeks chasing insights that arrive too late to matter. The conversation explores the limits of dashboards, the risks of AI hallucinations in finance, and what decision-grade analytics really means.
Ian Wong is the co-founder and CEO of Summation, an AI-powered decision platform built to help enterprise leaders better understand how their businesses are performing. Before Summation, Ian co-founded Opendoor and served as CTO through its journey to going public. He was also Square’s first data scientist, where he built early fraud and risk systems. Ian holds degrees in electrical engineering and statistics from Stanford University and brings a rare blend of deep technical expertise and business leadership experience.
In this episode, you will discover:
What the “Monday Morning Problem” is and why it slows down decision-making
Why dashboards and ad hoc reports often fail finance leaders
The risks of relying on generic AI tools for financial analysis
How decision-grade analytics differ from conversational AI
What the coming “query flood” could mean for data infrastructure and costs
Ian explains how Summation helps finance and operations teams move from manual data stitching to faster, more reliable insights. The discussion also covers AI hype versus reality, why trust matters so much in finance analytics, and how leaders can think more clearly about where AI fits into real business workflows.
Join hosts Glenn and Paul as they unravel the complexities of AI in finance.
Follow Ian:
LinkedIn: https://www.linkedin.com/in/ian-wong/
Company: https://www.linkedin.com/company/summation-hq/
Follow Glenn:
LinkedIn: https://www.linkedin.com/in/gbhopperiii
Follow Paul:
LinkedIn - https://www.linkedin.com/in/thefpandaguy
Follow QFlow.AI:
Website - https://bit.ly/4i1Ekjg
Future Finance is sponsored by QFlow.ai, the strategic finance platform solving the toughest part of planning and analysis: B2B revenue. Align sales, marketing, and finance, speed up decision-making, and lock in accountability with QFlow.ai.
Stay tuned for a deeper understanding of how AI is shaping the future of finance and what it means for businesses and individuals alike.
In Today’s Episode:
[01:58] – Meet Ian Wong
[05:25] – The “Monday Morning Problem”
[09:23] – What Empathetic Leadership Really Means
[13:15] – How Enterprise Research Really Works
[16:34] – The Monday Morning Numbers Meeting
[21:25] – A Balance Sheet That Still Doesn’t Balance
[25:53] – Where AI Actually Helps Finance Teams
[28:27] – The AI Hype Question of 2025
[33:07] – Moving into Personal Questions
Full Show Transcript:
Co-Host: Glenn Hopper (00:41):
Welcome to Future Finance. I am Glenn Hopper, along with my esteemed colleague, Mr Paul Barnhurst.
Host: Paul Barnhurst (00:50):
Yes, Senator, I'm running for Senate. Did you not know?
Co-Host: Glenn Hopper (00:53):
Our guest today, also with us, is Ian Wong. Ian is co-founder and CEO of Summation. Most recently, Ian co-founded Opendoor and served as CTO from inception to going public. Before that, he was Square's first data scientist, building the company's early fraud and risk systems. He holds degrees in electrical engineering and statistics, two of my favourite topics, from Stanford University, one of my favourite universities. Ian, welcome to the show.

Guest: Ian Wong:
Thanks for having me on.
Host: Paul Barnhurst (01:21):
Very excited to have you. I love the Square background. I worked for American Express, so fraud and all those types of things, I remember more than one conversation about. I think that's just a fascinating area, so totally.
Guest: Ian Wong (01:33):
Yep.
Co-Host: Glenn Hopper (01:34):
Great area for a data scientist, right?
Guest: Ian Wong (01:36):
A hundred percent. Yeah. I mean, classic ML fraud detection is one of the most typical applications of that.
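To make "classic ML" fraud detection concrete, here is a minimal sketch, not Square's actual system: a handful of hand-engineered transaction features (all hypothetical) feeding a standard scikit-learn classifier on synthetic data.

```python
# Minimal sketch of "classic ML" fraud detection: hand-engineered transaction
# features feeding a supervised classifier. Features, data, and labels are
# synthetic and hypothetical, not Square's actual fraud system.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical engineered features for each card transaction
X = np.column_stack([
    rng.lognormal(3, 1, n),      # transaction amount
    rng.integers(0, 24, n),      # hour of day
    rng.poisson(2, n),           # transactions on this card in the last hour (velocity)
    rng.random(n),               # mismatch score between billing and device location
])
# Synthetic labels: "fraud" when velocity or location mismatch is extreme
y = ((X[:, 2] > 5) | (X[:, 3] > 0.95)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```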
Host: Paul Barnhurst (01:43):
Yeah, exactly. Right up a data scientist's alley: machine learning, all those things. So I want to start with a question. It kind of ties a little bit back to when you and I chatted, because I know you and I have had a couple of conversations over the last year as you've been building your new project, Summation, which we'll get into. But I loved how you shared that when you were at Opendoor, you had an experience that you called the Monday Morning Problem, and I really liked the way you framed it. So can you tell our audience what the Monday Morning Problem is and why it is such a big problem for so many companies, in your opinion?
Guest: Ian Wong (02:16):
The P&L, the income statement, and actually a lot of the management P&L was literally printed, and we would all look at it. Five lines into the P&L, the question always comes up: hey, why is this line, maybe it's revenue today, maybe it's growth the next day, whatever the case might be, why is this line under budget? And that question used to drive me nuts, because it would kick off two weeks' worth of work between finance and data, BI and BizOps. The issue is, number one, by the time they come back in two weeks, the exec team has forgotten what the question was, embarrassingly, but more importantly, the insight's no longer relevant. And that was Monday morning, when you'd ask a simple question about the business and it would kick off these, again, what I think of as the expeditions, the armies that people go on to stitch data from dataset number one, data warehouse number two, and so on and so forth. That was Monday; that would be the total P&L review. Tuesday would be a pricing and inventory review, and the same thing happens. Wednesday would be a marketing and sales review, and the same thing happens. And so it really felt like Monday through Friday was this deja vu, day after day, where people were just constantly asking questions, not actually understanding how the business was doing. And that was so frustrating that after I left Opendoor, I decided to do something about it.
Co-Host: Glenn Hopper (03:42):
That's such a, I mean, anyone who's worked in FP&A or BI, that resonates with them, because you've got your standard monthly reports that you do, or daily reports, weekly reports, whatever they are, and you've got that. And then you have all these ad hoc reports, and they're never built into the system, so it takes all this work. And it also ties into the big argument now around finding ROI on AI projects, or any analytics projects, and the time-to-insight return of getting a quicker view into that. It's hard to measure that to the satisfaction of a CFO, where it's like, yeah, I get it, but what's the dollar value of it? It's like, well, if you're asking a question on Monday morning and have to wait until the next Thursday to get the answer, I mean, the ROI is the time that you lost, the number of days, of being able to pull the right lever and act on it. Right?
Guest: Ian Wong (04:31):
Yeah, totally. I think there are two aspects to that question, or that point you make, Glenn. Number one is the speedup. Just the fact that you have all these people doing all this manual analysis, there's that element of just how much time can I save? But more importantly, what's the depth that you can go into? What is the quality and the depth of the analytics? And I think those are the two big issues when I think of the way that people do work as it relates to analytics and FP&A and data science. We are in such an early era. I think in 10 years we'll look back and be like, I can't believe people used to do things that way. I can't believe that people used to have to click through dashboards, look at numbers, see what's red, and then figure out the next dashboard they should click into and repeat the same thing. That's literally how work is done today. And my hope is that over the next five years, we're going to replace eyeballs on dashboards with agents and tokens doing that work for us, so that we can not just automate the grunt work, but actually get deeper, better insights.
Co-Host: Glenn Hopper (05:39):
You're preaching to the choir here. I love hearing that. It could be argued, and very much so, that in 2025 you actually have the perfect background to be a founder, especially in the analytics and AI fields, with electrical engineering and statistics at Stanford. But still, there's a transition from being a data scientist, in the mindset you're in, to being a founder, where, I mean, I would suspect in your heart of hearts, maybe your favourite thing to do is to sit down in front of a machine learning algorithm or a new model and all that. But what was going from that really hands-on work like? And I'm sure as a founder you still have a lot of this as well, but what was the hardest part? Was there a notable transition going from data scientist to founder?
Guest: Ian Wong (06:31):
That's a great question. I think there are actually lots of things that are translatable, and I think data science and finance are actually very similar in that sense, in that to be a great data scientist, you have to empathise with the domain. You can't lead with a technique. You have to start with what does the business care about, who's the customer. Paul, you were talking about how you were at Amex, or you worked with Amex, and thought about fraud detection. Well, how do you design fraud detection systems? You actually have to empathise not just with your risk operations team; you actually have to think in terms of the fraudster and think about all the threat vectors that are coming at you, and your job, back in the day, was to do classical feature engineering to detect that. So there are lots of things that are translatable about data science and becoming a founder, which is empathy with the domain and the customer.
(07:19):
I'd say the hardest part, honestly, is the number of hats you have to wear as a founder. Yes, you have to be the tech co-founder like I was, and you have to manage data science and engineering and all that good stuff, but I was also the GM of pricing, and so I had to oversee the P&L for the business, and that was really hard. In one moment you're context switching on how do I be a functional leader for engineering and data science; in another moment you're context switching to what's going on with the P&L, why is it red in these areas, how do I course correct? And so I'd say the context switching, and having to get good quickly in many different areas and not be picky about what you have to get good at, that is hard, but it's also what makes the job fun.
Co-Host: Glenn Hopper (08:04):
So the switching cost just gets exhausting for me. You won't say this about yourself, but I'll say it about me: the hardest part in all that is, oh wait, in this role I actually have to care about people.
Guest: Ian Wong (08:19):
Well, I would say actually one of the biggest founder lessons I learned is how to be empathetic but also direct. I think one of the really challenging things about being a leader, especially when I first started managing, this was circa the early 2010s, the topic du jour, or the management term du jour, was authentic leadership or empathetic leadership. And it's actually really easy to misinterpret that to be abdication and not delegation. And so there's a lot there about how you effectively manage teams, and we can talk about that separately some other time. But yeah, lots of management lessons there.
Host: Paul Barnhurst (08:53):
A couple of things. I totally resonate with the context switching; the cost is so much higher now that I run my own business. Yes, I don't have a team to manage, but this call is a podcast, the next call is selling, the next one is just somebody who needs to chat, and the next minute I'm trying to build a course. It's just constantly trying to carve out enough time that you're not switching so much that you get nothing done during the day, because there is a real cost to that. I think AI can help with that some. And then the second one: you mentioned fraud and fraudsters. Having been in data science, you'll appreciate this. I worked in prepaid at Amex, and we had one guy there that anytime they wanted to know how the fraudsters would work, he had learned how to maximise reward points and how to do everything legally to game the system.
(09:42):
So you could question the ethics for sure, but he always kept within the legal bounds. So anytime they had a question, it was like, you do realise you ran that promotion and all you did was attract a bunch of scammers, and here's why. Or, you did this, and you realise that was terrible; go look at the data, let me show you how much money you lost. And he was just a master at basically what is the equivalent of fraud, in the sense of where are you losing money. It wasn't legally fraud, but it was still taking advantage of the offering. So it was really interesting. Alright, let's talk Summation here for a minute. I believe you started Summation almost 18 months ago. I know you were in stealth for quite a while. I think we chatted for the first time, what was it, probably eight, nine months ago.
Guest: Ian Wong (10:21):
That's right.
Host: Paul Barnhurst (10:22):
And so why don't you share with our audience what Summation is, what you do, and why you started it? Give us a little bit of that founder story of Summation.
Guest: Ian Wong (10:32):
Yeah, so I started the company because of that Monday Morning Problem, and I became really motivated to ask: what if enterprises actually had a summation layer over all their data and all their operations? A layer where I can actually interrogate and have a tactical feel for how the business is doing. My analogy is kind of like an F1 car, right? You want to be able to steer the business and run really fast, but you need to feel how the business is moving. And today your operations and that feedback loop are so detached, and that's what makes it really hard for leaders to operate. So what do we do? We are an AI platform that helps enterprise leaders transform all the complexity across data and business into clarity and action. What customers use us for is to help them automate reporting, generate deep insights, and more generally improve their decision making and operating cadences, whether that's a weekly operational deep dive or monthly and quarterly business reviews. These are all the different ways in which we can help our customers. And the value that we really bring to our customers is that they're able to get this real-time tactical understanding of their business and find strategic growth opportunities. It's so hard to actually understand how the business is doing, with all the things we just talked about, where it takes armies of people weeks to figure out what's going on. We want to condense all that and give people really fast feedback cycles.
Co-Host: Glenn Hopper (12:04):
So right front and centre on the Summation website, it says decision-grade AI platform for enterprise leaders. And I think my first question when I saw that is, how is your deep research, I don't know if that's even what you refer to it as, but how is that different than what I could get from ChatGPT or Grok or any of the other research models out there?
Guest: Ian Wong (12:25):
Yeah, that's a great question. One of our marquee features is what we call a deep dive. And before talking about what a deep dive does, let me give you an anecdote from the field. I've been chatting with a lot of CFOs, COOs, and other enterprise leaders. One persona that I've been chatting with a lot is the VP of data, and in more than one case they have told me that the CEOs of multi-billion dollar companies are going into Snowflake or Databricks or Redshift or any of these warehouses, taking some data, putting it into ChatGPT, and doing vibe analysis. And after this vibe analysis session, they would take the output and toss it over to the VP of data or VP of finance: hey, can you just check this? I think this is what we should be doing with our strategy. And the VP of data and VP of finance are like, are you kidding me?
(13:21):
Where do I even start? Because there's hallucination left, right, and centre, number one. And number two, people talk about vibe coding and, frankly, a lot of AI slop. Now there's strategic slop, there's analysis slop, and we want to be the counterpoint to that. Think about it: picture yourself in that Monday morning meeting we talked about, picture yourself as the GM of a major business unit or the CEO or what have you, and your VP of finance or VP of operations is presenting you with, here's how my business is doing. What do you expect? Well, you expect, number one, that it should be correct. The numbers in there should be correct. Number two, you expect it to be defensible, which means if I poke at a number or if I poke at a line of reasoning, you can trace it through to the source. You're able to justify your thinking, your calculations. And number three, you expect a level of strategic depth, and you expect it to have the context of the business. You expect that your VP or your GM is telling you something.
(14:30):
Those are the different elements that we really dig into with our deep dive. Number one is correctness. Every number is correct, and we force the LLM not to spit out numbers but actually to spit out the way to get to the number. And we actually do many checks before any number is presented to our customer. And every number that is shown is traceable back to the source. So much of our work is actually this multi-agent system to help our customers get there. One more thing I'll mention is that we don't deliver answers, we deliver deliverables. If you think about a weekly business review, it's not an answer, it's not output from a chat; what you want to see is actually a structured document. So we work with our customers to structure it: hey, here's what they mean by their weekly operational deep dive, here are the sections, here are the topics. How do we break this down such that this multi-agent system can give you this decision-grade report back to the users?
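To make "spit out the way to get to the number" concrete, here is a minimal sketch of that verification pattern, not Summation's actual implementation: the model returns a formula plus the source values it cites, and the host code recomputes and traces every input before the number is allowed into a report. All names and figures are hypothetical.

```python
# Minimal sketch of "decision-grade" number handling: the model never returns
# a bare number; it returns the calculation and the source values it used, and
# the host recomputes before anything is shown. Illustrative pattern only.
from dataclasses import dataclass

# Source of truth (hypothetical ledger slice)
LEDGER = {
    ("2024-Q3", "revenue"): 4_300_000,
    ("2024-Q3", "cogs"): 1_900_000,
}

@dataclass
class ModelClaim:
    metric: str
    formula: str   # how the model says it got the number
    inputs: dict   # variable name -> (ledger key, value the model claims it read)
    value: float   # the number the model wants to report

def verify(claim: ModelClaim, tolerance: float = 1e-6) -> float:
    env = {}
    # 1. Every cited input must trace back to the source of truth exactly.
    for name, (key, cited) in claim.inputs.items():
        if LEDGER.get(key) != cited:
            raise ValueError(f"untraceable input {name}={cited} for {key}")
        env[name] = cited
    # 2. Recompute the formula deterministically from the cited inputs only.
    #    (eval is fine for a sketch; a real system would use a proper parser.)
    recomputed = eval(claim.formula, {"__builtins__": {}}, env)
    # 3. The model's number must reproduce, or it is rejected.
    if abs(recomputed - claim.value) > tolerance:
        raise ValueError(f"claimed {claim.value}, recomputed {recomputed}")
    return recomputed

claim = ModelClaim(
    metric="gross_margin",
    formula="(revenue - cogs) / revenue",
    inputs={
        "revenue": (("2024-Q3", "revenue"), 4_300_000),
        "cogs": (("2024-Q3", "cogs"), 1_900_000),
    },
    value=2_400_000 / 4_300_000,
)
print(f"{claim.metric}: {verify(claim):.4f}")  # only a verified, traceable number reaches the report
```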
Co-Host: Glenn Hopper (15:30):
It's so funny, because I'm picturing sitting in that Monday morning meeting, and if you're the CFO or head of finance or whatever, and you're getting grilled on your numbers, I could see in that meeting, it actually wouldn't work for a huge company, but for a small enough company where you have a number of ledger entries that could fit into the context window of an AI, I could see dumping the whole GL or whole trial balance or whatever into Claude and saying, help, please, what's going on here? And that's kind of the dream, and that's where I think a lot of people run into problems. I've been testing Claude for Financial Services 2.0, which I don't know if it's been officially released yet, but it does amazing things with Excel, and you can tie it into data rooms, and it has all these great features and outputs really nicely formatted Excel workbooks.
(16:20):
But if you're going on the fly, if there's not a human in the review, there's not an inherent reason to trust that more than any other AI, other than it looks really professionally put together. The fact that it looks so good might be even more of a reason to be duped by it. So having that cross-checking against hallucinations, I know we haven't eliminated them completely, but that's huge for finance. You can be in a grey area, I always say, with sales and marketing, not on your analysis, but if you're writing marketing copy or whatever, obviously there's a lot more grey area there. But with finance, we can't be wrong. We can't have the whole probabilistic nature in finance numbers.
Host: Paul Barnhurst (17:02):
I mean, Google can't go public and say, oh, sorry, our AI gave the wrong number. We really earned 4.3 billion last quarter. That's not going to go over well.
Guest: Ian Wong (17:12):
But the other thing to really hone in on is that there's a workflow, right? There's an existing process where you have to review your ledger for variances. Maybe it's a monthly flux process that you go through. So there's an existing process already. And one of the challenges I see with most conversational analytics is that it's not a conversational analytics problem. I'm trying to put this flux analysis together; I already know these are the datasets, these are the questions, and this is the format I want to see the flux analysis in. That's a workflow. And so what we do is we deliver the deliverable, not just answers from a self-service conversational analytics platform. By the way, we do have a self-service conversational analytics part to our product. And what we see from actual usage is that, to be honest, most users don't even know what to ask, right?
(17:59):
It's kind of like the BI world circa the 2010s, with Tableau and Looker and all these things, where you have all these BI teams that made what is effectively now known as an ontology. And the idea is that, hey, you can go in and configure your own dashboards; it's just a pivot table. Well, it turns out, despite how easy it is to pivot things in these BI tools, people still don't. And we're seeing the same thing with conversational analytics. There's a chat box where you can literally ask anything and get information at your fingertips. The challenge is that people don't ask. However, there is a workflow, there is a meeting, there is a weekly business review, and you need to present information in that meeting. So that's what we're really focused on.
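The flux-analysis workflow Ian describes is easier to picture as a fixed deliverable than as a chat answer. Here is a minimal pandas sketch, with hypothetical accounts, amounts, and materiality thresholds: compute the variances, flag what needs commentary, and emit the same format every month.

```python
# Minimal sketch of a monthly flux (variance) analysis as a repeatable
# deliverable rather than an ad hoc chat answer. Account names, amounts,
# and the $50k / 10% materiality thresholds are hypothetical.
import pandas as pd

ledger = pd.DataFrame({
    "account": ["Revenue", "COGS", "Marketing", "G&A"],
    "prior":   [4_100_000, 1_850_000, 400_000, 310_000],
    "actual":  [4_300_000, 1_900_000, 520_000, 305_000],
})

ledger["variance"] = ledger["actual"] - ledger["prior"]
ledger["variance_pct"] = ledger["variance"] / ledger["prior"]
# Flag lines that exceed both the dollar and percentage thresholds and need commentary
ledger["needs_commentary"] = (
    (ledger["variance"].abs() > 50_000) & (ledger["variance_pct"].abs() > 0.10)
)

# The "deliverable": a fixed format the reviewer sees every month
print(ledger.sort_values("variance", key=abs, ascending=False).to_string(index=False))
```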
Host: Paul Barnhurst (18:39):
Ever feel like your go-to-market teams and finance speak different languages? Misalignment is a breeding ground for failure, impairing the predictive power of forecasts and delaying decisions that drive efficient growth. It's not for lack of trying, but getting all the data in one place doesn't mean you've gotten everyone on the same page. Meet QFlow.ai, the strategic finance platform purpose-built to solve the toughest part of planning and analysis: B2B revenue. QFlow.ai quickly integrates key data from your go-to-market stack and accounting platform, then handles all the data prep and normalisation under the hood. It automatically assembles your go-to-market stats, makes segmented scenario planning a breeze, and closes the planning loop. Create airtight alignment, improve decision latency, and ensure accountability across the team.
(19:46):
That makes a lot of sense, because it's easier to structure that and bring it back each time when you have a workflow behind it. I just did a webinar before we jumped on here. We've all seen vibe coding, you talked about slop and all that, and now we're seeing vibe working. Microsoft's leaning into that: just tell Excel what you want it to build. And I told everybody, the majority of you would get more benefit from getting better at Excel and modelling than from trying to use AI. Sometimes it'll look perfect. We had one where, from a format standpoint, it built a beautiful integrated three-statement model with a separate assumptions tab and schedules, did a really good job, and then it was out of balance. So the balance sheet didn't balance, and we asked it to fix it. It spent some time, got a little closer, and finally it came back and said, it's only a 1.3 million variance, that's close enough, I'm not looking any further. And we went, if I'm an intern and I gave that to my boss, I just got fired. Sorry, I'm not looking any further, it's close enough. It just totally made us laugh, like, wow. See, when you talk about slop, that was the example. The other thing it did is it reversed the number formatting and made positive numbers negative with custom formatting and negative numbers positive, but only did it on two lines in the entire model. You know how long it took us to find that?
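Paul's out-of-balance model is exactly the kind of thing a hard programmatic tie-out catches before anyone calls a 1.3 million variance "close enough." A minimal sketch with hypothetical figures: assert the accounting identity instead of eyeballing the output.

```python
# Minimal sketch of a balance sheet tie-out check with hypothetical figures:
# fail loudly on any imbalance instead of accepting "close enough."
def check_balance(total_assets: float, total_liabilities: float, total_equity: float,
                  tolerance: float = 0.01) -> None:
    imbalance = total_assets - (total_liabilities + total_equity)
    if abs(imbalance) > tolerance:
        raise AssertionError(f"Balance sheet out of balance by {imbalance:,.2f}")

check_balance(52_700_000.00, 31_400_000.00, 21_300_000.00)   # passes quietly
check_balance(52_700_000.00, 31_400_000.00, 20_000_000.00)   # raises: off by 1,300,000.00
```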
Guest: Ian Wong (21:02):
Yeah. Oh my gosh. Well, going back to operating cadences, the reason why we focus on that a lot is companies have a sense of what good looks like. People are already doing it. And so in many ways that defines the eval, right? It's not like de novo deep research. And by the way, with deep research, one of the issues that I see with a lot of these platforms, again, is that you get a bunch of SQL queries thrown at you. Well, now are you going to check every single query? Like, oh my gosh, yes, it saved me a bunch of time, but to your point, checking everything in this Excel workbook, checking every SQL query that's thrown at you, that's really hard. So a lot of what we focus on is building an environment that's familiar to our customers. And again, the enemy is how work is done today. The enemy is all these people going out to all these data silos and manually stitching things together. How do we streamline all that for our customers?
Co-Host: Glenn Hopper (22:00):
Going back to what you said about not knowing what to ask it: when I talk to finance professionals, I say all the time, well, you still have the domain expertise of being a finance person, because if you don't know the difference between EBITDA and net income and operating income, you don't know the right questions to ask. And it's the same thing on some level. The biggest barrier to entry around data science was that you had to be able to code to do it, but now you can vibe code your way into data science, and if you don't know how a clustering algorithm works, or k-nearest neighbours, or whatever algorithm you're using, then it might as well be a black box. It might as well be like what was the old one, DataRobot, where it built amazing models, but you put it in the wrong people's hands and they're using the completely wrong algorithms for their predictions. So on one hand, yes, you have access to it, and a lot of this is taking hallucination out of it, but it's also a lot of power in the hands of someone who may not have the background to know the right questions to ask or understand how they got the results, even if they see all the code or the SQL queries right there in front of them.
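Glenn's black-box point is easy to demonstrate: the same k-nearest-neighbours classifier can flip its answer depending on a parameter the user never thought about. A small sketch on made-up data:

```python
# Small sketch of why you need to understand the algorithm you're vibe-coding:
# the same k-nearest-neighbours classifier flips its answer as k changes.
# Data points are made up.
from sklearn.neighbors import KNeighborsClassifier

X = [[1.0], [1.2], [1.4], [5.0], [5.2]]   # one feature, e.g. a spend ratio
y = [0, 0, 0, 1, 1]                        # two classes
query = [[4.0]]

for k in (1, 5):
    model = KNeighborsClassifier(n_neighbors=k).fit(X, y)
    print(f"k={k} predicts class {model.predict(query)[0]}")
# k=1 predicts class 1 (nearest point wins); k=5 predicts class 0 (majority wins)
```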
Guest: Ian Wong (23:10):
Yep, yep. So that last mile problem is a big one, and that's what we're looking to solve.
Host: Paul Barnhurst (23:14):
Yeah. So I want to get to an article you recently wrote called The Query Flood. You talked about how it is coming, and I think this really gets back to analytics and what we talked about with dashboards and data. You said that with the query flood that's coming, it might take down our analytics infrastructure. So can you describe to our audience what the query flood is, why it's such a concern, and why you think it's going to be a problem?
Guest: Ian Wong (23:45):
Yeah, so I wrote that blog post because I've been looking at how our agents in Summation work, and they're issuing a tonne of queries. So when we do a deep dive in Summation, it can take minutes, sometimes up to an hour, to run. And the reason why is we're spinning up 20, 30 agents, and they're looking at every nook and cranny of your business, cross-checking each other. In one of these runs, over the course of an hour, it issued over 7,000 queries against our database. And part of what we had to do is build an AI calculation engine that can accommodate all these parallel queries, but it's a lot of queries. And it reminded me of a time back at Square, and I encourage the audience to read the blog post, but back when I was at Square, the whole payment stack, crazily enough, was run as one monolithic Ruby on Rails app, and the dashboard was actually connected to the same database that runs the transactions.
(24:44):
And so, as anyone knows, you shouldn't be running your analytics on the same operational database that runs your payment system. But we did, because as a startup we were like 40 people at the time, and a PM at Square at the time was frustrated by how slow the dashboard loaded. So he just rage-held down the refresh command, and that issued like 30 or 40 super expensive queries against the database, and suddenly the site went down and people didn't understand why payments stopped flowing. And so it was a fun postmortem. The experience of building this multi-agent system kind of reminded me of that, where effectively the agents are the ones pressing refresh across every single dashboard in your business. And think about that flood of queries hitting your online systems. Think about a world where humans are no longer issuing the queries, but the agents are issuing the queries.
(25:43):
And so the amount of analytics and the amount of querying that's going to be done, I think it's going to be 10x, maybe even 100x, in the coming few years. I think we're already seeing that in software engineering with the amount of code that's being written; the same thing's going to happen to analytics. And by the way, as the CTO, part of my job was to keep my data warehouse and data platform bills down, and I would get heart palpitations whenever I got a bill from the data warehouses of the world. It was multiple millions of dollars, growing like 50% year over year. Now you're telling me that agents are going to scale that number up by 10x, 100x? That's kind of scary. So my point is that how we think about analytics and the whole data infrastructure stack needs fresh thinking, because the world's moving to a very different place. How we got to where we are is in many ways an overhang from Hadoop, and as we think about where we're headed, again, we need to have some fresh thinking there.
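The query flood translates directly into engineering constraints. Below is a minimal sketch, not Summation's actual calculation engine: cap how many agent queries are in flight at once and share cached results for repeated queries, so thousands of agent requests don't become thousands of warehouse scans. The `run_query` function is a hypothetical stand-in for a real warehouse client.

```python
# Minimal sketch of keeping an agent "query flood" from overwhelming (or
# over-billing) a warehouse: a concurrency cap plus shared, cached results
# for repeated queries. run_query is a stand-in for a real warehouse client.
import asyncio

MAX_CONCURRENT_QUERIES = 8
warehouse_calls = 0

async def run_query(sql: str) -> list:
    global warehouse_calls
    warehouse_calls += 1
    await asyncio.sleep(0.05)              # pretend this is warehouse latency
    return [("row", sql)]

async def main() -> None:
    semaphore = asyncio.Semaphore(MAX_CONCURRENT_QUERIES)
    cache: dict[str, asyncio.Task] = {}

    async def throttled_query(sql: str) -> list:
        # One in-flight call per distinct query, so concurrent agents asking
        # the same question share a result instead of each hitting the warehouse.
        if sql not in cache:
            async def fetch() -> list:
                async with semaphore:      # at most N queries in flight at once
                    return await run_query(sql)
            cache[sql] = asyncio.ensure_future(fetch())
        return await cache[sql]

    # 300 overlapping agent queries, as in a multi-agent deep-dive run
    queries = [f"SELECT * FROM metrics WHERE segment = {i % 10}" for i in range(300)]
    results = await asyncio.gather(*(throttled_query(q) for q in queries))
    print(f"{len(results)} agent queries answered with {warehouse_calls} warehouse calls")

asyncio.run(main())
```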
Co-Host: Glenn Hopper (26:43):
God, that's so funny. I was going to bring up Hadoop when you were talking through all that. And then I was just picturing everybody's data lake, data warehouse, data lakehouse, whatever, just held together with duct tape and baling wire, and what it would do to them to have 10x, a hundred x more queries on this data. So yeah,
Host: Paul Barnhurst (27:00):
I'm sure there are a few people we gave heart palpitations to, some of the CTOs listening in there, and the data people are just like, oh, I can just see the system crashing.
Guest: Ian Wong (27:10):
And the cost. And the cost.
Host: Paul Barnhurst (27:12):
Yeah. Then there's the person who looks at the cost part going, wait, I'm paying 2 million and now you're telling me it's going to be 20 million?
Co-Host: Glenn Hopper (27:17):
That's right. Alright, Ian, we've thrown you a lot of softballs. I'm going to now ask you the hardest question of 2025. This is the question, so I'm getting you to do my work for me here, is what I'm hoping. It's a question that I get every day, and we're going to skip the ROI part of it, just take ROI out of it. If you've seen the latest Gartner hype cycle report, AI agents in particular are at the absolute peak of the hype cycle right now. So there's all this fear. There's fear around missing out. You're hearing all these tech bros say half our company is agents and we talk to them like they're employees. And then on the other side you've got people who aren't doing anything, so they've kind of got FOMO. And then there's also fear of the technology, because of everything we talked about with hallucinations, with not knowing when you can trust it, not understanding how it works. But since there's so much noise around the hype cycle right now, how would you advise employees or companies in general to figure out where to apply it? And maybe we make this specific to generative AI, but you can go broader if you want. Where can we apply AI in our company to get the most tangible benefit, considering where the technology is right now?
Guest: Ian Wong (28:32):
Totally. And it's a really great question, because like you said, everyone's talking about AI. I was just chatting with literally partners at some of the largest consulting firms, think the McKinseys, BCGs, and Bains of the world, and they're talking to their clients about AI. Everyone's talking about AI, but no one's really doing AI. And that's kind of where we're at with all this hype. And honestly, by the way, I'm starting to see the trough of disillusionment, where people are like, well, I don't trust it. Honestly, it's actually affecting vendors like us, where people say, I don't trust AI, because it was hyped up to be this big thing and I tried ChatGPT over my data and it doesn't work, so why would you guys work? And the point is, well, that's exactly why we're here. Anyway, the point being, we are kind of in this weird moment in time.
(29:17):
What I would say, and I'm sure everyone's sharing similar advice: I think as a finance team, we have to work both forward and backward. Forward meaning go log into ChatGPT and try it. I do think it's really valuable to understand the limits of the technology and where it can deliver a tonne of value. So try that. Try advanced prompting techniques, just get a subscription and try things out. Try ChatGPT, Gemini, Manus, you name it. Just try them all. The other part of it is working backwards, though. And what I mean by that is: what is a business process? What is a workflow? Where do we think AI can help the most? The classic, hey, where are the bottlenecks in this process? Maybe it's financial operations and there are things around reconciliation that are really painful. Great, can we apply it there? I've got to put together this weekly business review; hopefully you'll consider Summation, but if you want to do it yourself, fine, what are the bottlenecks in that kind of workflow? So the other part of it is, what are all the parts of my job that frankly suck because they're super laborious? Let's work backwards from that. So I think it's a bit of learning and going with the platforms as they get better, and also doing an honest accounting of all the ways in which you think your job can be improved.
Co-Host: Glenn Hopper (30:31):
I love it. You've just validated what I've been telling clients too, so thank you for that. And Paul, I'm going to do a shameless self-promotion here. You were talking about building workflows. I have a new course on LinkedIn Learning, Building Finance Automations with n8n. So all of our LinkedIn viewers, if you're on Premium and use LinkedIn Learning, you can check that out.
Host: Paul Barnhurst (30:50):
Look how fancy you are, another course. I think we've covered so much on the whole hype and disillusionment, but I just totally agree. People just need to get in and start learning and trying and experimenting. And that's what I tell so many people: you've got to start somewhere. Start small, learn, and look at what others are doing. Learn from experts in the space like Glenn and so many others out there that have done a lot of the work. The reality is, yes, there's hype, but if you choose to just think it's all hype, you're going to miss out, because I can guarantee you it can save you hours. There's no doubt. No matter what your role is, I don't care if you're in finance, marketing, sales, anything. If you work any type of job, it can save you time. But you have to be willing to put in time.
(31:40):
And generally you have to know what you're doing so you can validate what it's giving you. So many people have thought for the longest time, well, AI killed Excel. No. Well, AI killed coding. No. It's a magnifier. That's what I keep telling people. If you don't know what you're doing, it's going to magnify that. If you know what you're doing, it's going to magnify that, and it's super exciting. So I love what you shared there. A lot of great advice. Alright, we're going to move into our personal questions. Funny enough, normally it always numbers them one through 25. This time it counted the six questions I gave it for the episode and gave me seven through 31. It didn't give me one through 25, and it took me a couple of minutes to figure out what the heck it did. I'm like, why did it start at seven? And so instead of picking a number between one and 25, and Glenn does this a different way, we'll let him go next, there are two options: we can let the random number generator, if I can speak here, pick a number between seven and 31, or you can pick a number between seven and 31, and I'll ask that question.
Guest: Ian Wong (32:42):
I will let the random number generator have at it.
Host: Paul Barnhurst (32:46):
All right, and I've got to set it up, because I had it set at five, so we'll change it here.
Co-Host: Glenn Hopper (32:51):
Do I change your parameters there?
Host: Paul Barnhurst (32:53):
I do. Alright. It gave me 10. So let's see what 10 is. As Square's first data scientist, you expected complex fraud but found only 50 cases. What was your most hilariously wrong assumption about the job?
Guest: Ian Wong (33:10):
Yeah, so I took the job at Square because at the time I was a bored graduate student at Stanford, and I had just interned at Facebook and realised, oh my God, what am I doing in grad school? So I dropped out and literally had two conversations: one with Keith, who was the COO at the time and later on my co-founder at Opendoor, and one with Jack Dorsey. And then I got the job, and there was no technical screening. I was actually a little sketched out, so I didn't know what to expect out of the job. And they were just like, hey, we're a payments company, there's going to be fraud, just come in and build machine learning algorithms. I'm like, great, that's what I studied in school, that's what I'm great at, I'm going to go do it. And like you mentioned, Paul, the first day I showed up, I was like, great, I'm here to build some models, show me the fraud.
(33:53):
And the company was a year and a half old. There were 40 people, there was no fraud, there were 50 cases of suspected fraud. So I'm like, what am I doing here? There's no data science, there's no fraud. But that was such a blessing in disguise, because that's the thing with startups: you kind of go in thinking you can do one thing, but honestly you have to wear all these different hats. So I started by building tools for the risk ops team, and then eventually analytics infrastructure, and eventually the ML side of the house. But honestly, I thought I was going to be doing one thing entering Square, and I ended up doing 10 other different things, and that was awesome.
Host: Paul Barnhurst (34:29):
All right, you're up, Glenn. Let's see what the AI picks for Ian here.
Co-Host: Glenn Hopper (34:33):
Alright, so yeah, I take the human completely out of the loop here. We just turn it over. The AI generated the questions, and the AI...
Host: Paul Barnhurst (34:40):
Could hallucinate, for all we know.
Co-Host: Glenn Hopper (34:41):
...can pick one. So alright, actually, I really like this one. I think you're going to like it too. Your Twitter handle is @ihat, from physics. How would you explain your career... wow, this is a weird one. How would you explain your career using only vector notation? Only AI could have come up with this question.
Guest: Ian Wong (35:02):
Oh my gosh. Yeah. By the way, I had to change it from ihat to Ian Wong underscore, because people were just not getting ihat. So I'm very surprised. I mean, it's a good thing that the algorithm...
Host: Paul Barnhurst (35:13):
Well, you know what? It also called it Twitter, so who knows how old that was, versus X.
Guest: Ian Wong (35:17):
That's true. Yeah. Now there's all this AI stuff, and maybe this is getting a little bit abstract, but maybe we're all just gradient descent for this giant universal algorithm that we're all part of. So I'm just contributing my own little tiny i-hat.
Co-Host: Glenn Hopper (35:32):
Love it. That was a brilliant answer, by the way; you could use that on your personal statement for Stanford. And now we're going to give you 10 minutes to just talk about gradient descent and all the fun.
Guest: Ian Wong (35:45):
Yeah, maybe the subscriber count might take a bit of a dip after.
Host: Paul Barnhurst (35:47):
Well, we'll tell anyone who wants to know what it is to go ChatGPT it instead of Google it, right?
Co-Host: Glenn Hopper (35:54):
Right.
Guest: Ian Wong (35:54):
There you go.
Co-Host: Glenn Hopper (35:56):
This has been a great episode, and they go by so fast, but I'm literally grabbing my bag and walking out the door right after this. So Paul, any famous last words as we bid adieu to Ian, i-hat? No, just kidding.
Host: Paul Barnhurst (36:09):
Kidding. Thanks for joining us and it was a real pleasure.
Guest: Ian Wong (36:13):
Yeah, appreciate it. Thank you so much.
Host: Paul Barnhurst (36:16):
Thanks, Ian. And thanks for listening to the Future Finance Show, and thanks to our sponsor, QFlow.ai. If you enjoyed this episode, please leave a rating and review on your podcast platform of choice, and may your robot overlords be with you.