Future Tech: An interview with Isabel Hoffmann, Founder and CEO of Tellspec
In this Future Tech interview we’re speaking with Isabel Hoffmann, founder and CEO of Tellspec where she leads a team of researchers, mathematicians, software developers and designers aiming to build a healthier world by empowering people to make informed choices about what they eat. Tellspec has developed a handheld food scanner that connects to a smartphone to inform the user about allergens, chemicals, nutrients, calories, and the ingredients present in any food item. Isabel is an entrepreneur who has successfully founded eight companies over the last 26 years in the fields of preventative medicine, healthcare, technology, mobile health and education. Her natural ability to lead and inspire has been recognised with numerous awards and honours.
Shara Evans (SE): Greetings. This is Shara Evans from Market Clarity, and it is my absolute pleasure today to be speaking with Isabel Hoffmann, the founder of Tellspec. I discovered Tellspec a couple of months ago when I was doing some research on new technology in the healthcare industry, and somehow I stumbled across the work that you’ve been doing in the area of analysing food with your spectrometer technology. I was immediately fascinated and wanted to talk to you straightaway.
You’ve got quite an impressive background. You’re an entrepreneur who has successfully founded eight companies over the last 26 years in the fields of preventative medicine, healthcare technology, mobile health, and education. As far back as the ‘90s, you were running a gaming company, and then, after 2001, you moved into the healthcare industry. That’s really quite a fascinating shift.
To start our conversation, can you give us a bit more insight into your background and the reasons why you decided to develop Tellspec?
Isabel Hoffmann (IH): Hi Shara, nice to meet you. My academic background is that I’m a mathematician, but I’m actually a doer and problem solver. I like dealing with problems and finding solutions to them. I’m fairly creative, and I typically think outside the box.
If you really follow mathematics, a lot of the pure mathematics theorems have been proven by people who think outside the box. A perfect example is Fermat’s Last Theorem, considered for over 300 years to be the most difficult mathematical problem. It was proven by a mathematician named Andrew Wiles in 1995 using all sorts of mathematical techniques borrowed from many fields, and not necessarily from number theory, where the actual conjecture originated.
Tellspec was founded because my daughter had many food allergies, and she needed to make sure she was not exposed to those allergens, so going to restaurants, or parties, or even eating at school became a safety issue. My daughter also suffered from a pre-diabetic condition, so tracking net carbs and calories was also very important. Tellspec was born from the desire to help my daughter and others like her to improve their health. Tellspec is really a data company that provides the most relevant, useful and accurate information about food, because we all need to know that the ingredients in our food are nutritious, safe and authentic.
SE: Well, I love the fact that your background includes outside-the-box multidisciplinary thinking because in so many different entrepreneurial companies and scientific advancements, I’ve seen that the multidisciplinary approach is really the best way to solve problems.
An example would be, in computer algorithms, if you bring in people with a biology background, who understand the way that things work in nature, those same sorts of naturalistic algorithms can be applied to computer science problems. I’ve seen it time and time again.
Like yourself, I’m also very interested in what’s in the foods that we eat, and I know there are ingredients in many foods that most people aren’t aware of, and they’re certainly not aware of how those ingredients might impact their health.
Let’s turn a little bit now to Tellspec’s technology that you and your team have developed. From what I understand, the technology allows you to determine the chemical makeup of food. The Tellspec device shoots a light at the food and then sends results to software that you’ve developed in the cloud for analysis. So you’re actually providing what I would call predictive intelligence about food. Can you perhaps round out this concept with some more details on how Tellspec actually works?
Putting spectroscopy in the hands of consumers
IH: Okay, so the first thing I have to say is that we actually did not invent spectroscopy. This is a science that has been used for over 50 years. Tellspec’s algorithms are agnostic to the type of near-infrared spectroscopy used, but currently our small food scanners use a spectrometer that is based on Texas Instruments’ DLP [digital light processing] technology. We have an agreement with TI and we have been working with them for well over a year now.
SE: Okay, so you’re using field-proven spectrometer technology?
IH: No, it’s not exactly fully proven technology.
IH: Texas Instruments has developed a DLP chip, which stands for digital light processing. It was originally developed for projectors, like cinema projectors, or business projectors that are used to make business presentations. The DLP chipset can actually count the number of photons received at different wavelengths.
Until recently, it would have been very difficult to make a spectrometer this small with such a wide wavelength range, but the DLP chipset replaces a series of different detectors, making it possible to shrink the spectrometer without losing wavelength range. In broad terms, the hardware technology works as follows. You shine light at the sample — in this case, food — that you want to analyse. The energy of those incoming photons is known and measurable. When the photons hit the food sample, the molecules absorb some of the energy and reflect the photons back; the reflected photons have lost some of their energy. The spectrometer reads how many photons are coming back and at what energies they return. Essentially, this information on the number of photons and their respective energies gives us a unique fingerprint of the food sample.
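The photon-counting description above can be sketched in a few lines. This is an illustrative toy, not Tellspec’s actual processing: it treats a scan as raw photon counts per wavelength band and normalises them into a relative “fingerprint” that can be compared between scans. All names and numbers here are invented for illustration.

```python
# Toy model of a spectral "fingerprint": raw photon counts per
# near-infrared wavelength band, normalised to relative intensities.

def normalize_spectrum(photon_counts):
    """Scale raw photon counts to relative intensities in [0, 1]."""
    peak = max(photon_counts)
    return [count / peak for count in photon_counts]

# Raw counts per wavelength band (illustrative values only).
scan = [1200, 3400, 2900, 800, 1500]
fingerprint = normalize_spectrum(scan)
# The strongest band is scaled to 1.0; the rest are relative to it.
```

Two scans of the same food should then produce similar fingerprints even if the absolute photon counts differ, which is the property the detection step relies on.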
TI’s DLP technology is absolutely amazing and its application to spectroscopy holds immense potential for the Internet of Things. TI has made a major investment and many years of research to develop this technology.
Tellspec’s strength is in the analysis of the information received by the spectrometer, so the user can obtain relevant information about the food they scanned. This gives the user food analysis on the go. We take in the raw spectral data from the light reflected from the food, and we are able to analyse that to uncover nutritional information about the food sample scanned.
SE: Okay. I’ve got to ask you a question there, Isabel, and that is: with the analysis of the reflected photons, are you relying on a scientific database or some other pool of known knowledge of molecular analysis, or is this something that you and your team have developed?
IH: It is detection technology that Tellspec developed and has patented. Typically, scientists who use infrared spectrometers will use software like GRAMS or The Unscrambler to do some analysis of what molecules they are seeing. We do not use any third-party software to do our detection analysis. We have developed our own algorithms and our own detection engine by combining bioinformatics, chemometrics and machine learning. It’s interesting that you mention people with a biology background who develop algorithms. We have those types of people in-house at Tellspec. For instance, we’re able to look at the spectrum obtained from a scan and know if the sample has gluten or not.
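Tellspec’s detection engine is proprietary, so the following is only a generic sketch of the kind of classification described above: a nearest-centroid classifier that labels a normalised spectrum by its closest reference spectrum. The reference data and labels are made up for illustration.

```python
# Generic nearest-centroid classification of a spectrum (illustrative
# only; Tellspec's real engine combines bioinformatics, chemometrics
# and machine learning).

def distance(a, b):
    """Euclidean distance between two normalised spectra."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(spectrum, references):
    """Return the label of the closest reference spectrum."""
    return min(references, key=lambda label: distance(spectrum, references[label]))

references = {
    "contains gluten": [0.9, 0.4, 0.7, 0.2],
    "gluten free":     [0.3, 0.8, 0.2, 0.6],
}
result = classify([0.85, 0.45, 0.65, 0.25], references)  # → "contains gluten"
```

A production system would train on many labelled scans rather than a single reference per class, but the principle — match an unknown fingerprint against known ones — is the same.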
This has never been done by anyone before. The people who work with us are very much interdisciplinary, coming from biology, chemistry, physics, mathematics and spectroscopy, as well as computer science. It’s a mix of people; it is great fun to see how each thinks, and the collaboration that goes on is amazing.
SE: I can see where that would be really powerful.
IH: Yes, and by the way, it’s difficult to find these types of people. So if someone out there wants to work with us, they should know we are hiring. It’s difficult because, if you just grab graduates from computer science, you get people who can program and can do machine learning, but they really do not understand anything about chemistry or biology. Then sometimes the chemist or the biologist doesn’t know how to program or do machine learning. Furthermore, these two groups often know little or nothing about spectroscopy. And physicists have the optics and the spectroscopy part, but they don’t understand the chemistry or machine learning.
I look at it as a type of “globalisation” of scientific thought. We reached a point in our technology innovation where the knowledge contained in one science alone, such as chemistry, cannot answer the new scientific questions; the answer lies, like the proof of Fermat’s Last Theorem, in borrowing knowledge from other sciences. Very similar to the economic point we reached a few years ago, when companies needed to globalise their efforts in order to find resources out there to produce their goods at cheaper prices than they could do in their homeland. Suddenly, we could not advance any more in technology unless we started mixing all these sciences together. We are talking about the Internet of Things, are we not? In the case of Tellspec, the “thing” is food.
IH: That’s where we are. The Internet of Things is really mixing all the sciences together.
SE: I actually know some researchers here in Australia that may fit your mould. We’ll have to talk offline about that as well.
But I wanted to ask you about the patent process. When you put in your application that, say, showed your software was able to detect gluten, as part of a patent application, do they test that, or do they just look to see whether anyone else has applied to patent similar technology?
IH: The patent office does not conduct scientific experiments to validate a claim or a set of claims. They look at whether someone else has already made those claims, and they decide which claims to allow and which to disallow.
SE: That’s fascinating. How would you test your results against the real world?
IH: We’ve done some validations with different groups. We used about 25 people in California who took the scanners, used them, and gave us feedback. Those people were all diabetic or pre-diabetic.
We also do internal validations. We’re just going through that process with the several scanners we have here. We have about 10 scanners, and we’re validating the results. Do they perform correctly with our algorithms or not? If they don’t, why don’t they? What accuracy can we expect?
We also now have several groups that want to do third party validations and we will soon be shipping scanners to those groups. We hope they do scientific validation and publish their work.
SE: Well, that’s right. You do need that sort of external validation. For your internal validation, would you wave a scanner over, let’s say, bread that you know had certain ingredients and just check that it’s coming up with the right ingredients, or a pre-prepared meal, and manually validate that the results from the spectrometer and your analytics engine match known ingredients?
Not so simple scanning
IH: Yes, that’s one way of doing it. We do a lot more than that. For instance, we test if scanning the same place gives you similar results.
Now you have to understand one thing: this is not like scanning a label. What we see and how we get the light reflected back is dependent on the position of the scanner on the sample, because the sample may look uniform to us but will likely vary at the molecular level. Some areas of a cake may have a little bit more sugar, less fat, whatever. The results are similar but not identical. That’s the first thing. We’ve done a lot of testing, scanning different areas of a sample to see if we get consistent results.
This is one of the things that worries us the most in terms of bringing this technology to the consumer, because a consumer is used to looking at the label and saying, “Aha! This is what it has.” The customer doesn’t necessarily realise that the label may not discriminate particular ingredients. For instance, sugars are typically never broken into different sugars. Typically, the label simple says sugar. But it could be fructose, glucose, maltose, etc. Also labels do not identify by-products of the manufacturing process. No label would say that the product contains acrylamide, a by-product of cooking starch at high temperature that is actually a carcinogen. Also labels are often imprecise because label regulations worldwide allow the ingredients listed to fluctuate up to plus or minus 20 percent.
SE: Twenty percent? That’s a lot.
IH: Yes. It depends on the country, and it depends on the actual ingredient, but all the way up to 20 percent, so that the food manufacturing industry doesn’t have to relabel every time that they make small changes.
SE: That can be a serious problem with people who have allergies, especially severe allergies.
IH: It is also a serious problem for diabetic people, and I think a more serious one for those counting net carbs and calories. For allergies, the label typically says, “May contain a certain allergen,” but for diabetic people 20 percent more net carbs makes a real difference in the amount of insulin they need to take.
SE: Yes. If you have 20 percent difference on five different things that you eat in a day, that really could add up.
IH: Yes, exactly. The study we did with diabetic people really showed us that they end up having their own unique and individualised diet. It’s very difficult in the beginning, when you’re diagnosed as diabetic, to get used to counting net carbs and calories, but eventually, within five to 10 years or so, most diabetic people end up having their own way of doing things, knowing what they should avoid, often not knowing why, but knowing from experience that they should avoid it.
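To make the tolerance arithmetic in this exchange concrete, here is a small illustrative calculation. The insulin-to-carb ratio of 1 unit per 10 g of net carbs is an assumption chosen for round numbers, not medical guidance, and the figures are invented.

```python
# How a +/-20% labelling tolerance propagates into an insulin dose
# estimate (illustrative arithmetic only; not medical guidance).

LABEL_TOLERANCE = 0.20          # labels may be off by up to 20%
CARB_RATIO_G_PER_UNIT = 10      # assumed: 1 unit of insulin per 10 g net carbs

def insulin_units(net_carbs_g):
    """Estimated insulin units for a given amount of net carbs."""
    return net_carbs_g / CARB_RATIO_G_PER_UNIT

labelled = 50  # grams of net carbs the label claims
low = labelled * (1 - LABEL_TOLERANCE)   # 40 g
high = labelled * (1 + LABEL_TOLERANCE)  # 60 g

# The appropriate dose could fall anywhere in this range.
dose_range = (insulin_units(low), insulin_units(high))  # (4.0, 6.0)
```

A two-unit spread on a single labelled item shows why the tolerance matters far more for carb counting than for a typical calorie budget.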
SE: It’s difficult in a restaurant. If you’re preparing your own food, you have control over what ingredients go into it, or if you’re buying something in a supermarket you might look at the labels and make decisions accordingly, but when you go out somewhere, you’ve got no idea how something might have been prepared.
Food allergy sufferers can eat with confidence
IH: That’s really the value proposition of Tellspec. It’s exactly what I said earlier: I wanted my daughter to be able to go to a restaurant and eat with confidence, and not think that she may end up in the hospital after she finished a meal. At home you control the ingredients you use to cook, but when you go to a friend’s house, or to school, or to a restaurant, the situation is very different. How many times did my daughter eat gluten at school, feel bad, and miss the next day of school? Many times, although the school knew she was celiac, and the food she ate was supposedly gluten free. But you never know. There can be contamination, and sometimes even ignorance on the part of whoever is cooking the meal. Look at icing sugar: no one even dreams that it has gluten, but in North America, unless it says gluten free, it likely has a small amount of wheat flour and therefore contains gluten.
SE: Yes, and it gets really hard to test for all these things yourself, especially when you’re out and you’re hungry and you just want something to eat.
SE: Let’s go back to how Tellspec works. You’ve got a tiny scanner, and you’re basically shooting light, not a laser, but just light at the food, and you’re getting information back from that light scanning. From there Tellspec is communicating, I’m assuming, with an application on your smartphone, and from the smartphone to your predictive analytics that are cloud-based. Is that correct?
IH: Yes, that is correct. Then information from the predictive analytics goes back to the app to be delivered to the smartphone, or the smartwatch, or the tablet, whatever it is that the person is using.
SE: Okay, so what sort of things would a person see looking at the app?
IH: The app is actually something that we can get you to download if you want. It’s already approved in the Apple store, but right now we’re controlling its public release until we deliver the remainder of the scanners to the groups I mentioned earlier. This could be something you may be interested in.
You can use the app without a scanner. The app contains an encyclopaedia of food ingredients and their health-related implications, bad or good. This encyclopaedia allows you to type in the name of any ingredient and know in a flash the relationship between that ingredient and your health.
That encyclopaedia can be used without the scanner at any time. Say you are in a supermarket and the label reads maltodextrin. To know what maltodextrin is you can simply use the encyclopaedia inside the app. We call it “TellSpecopedia”.
The Tellspec app also has the ability to receive the results of a scan and store that scan so it is assigned to a certain meal, in much the same way that a lot of the apps that track your calories and count your carbohydrate intake work. The Tellspec app also does that, except you do not have to enter any information about the food; you simply scan it.
Someone told me that the Tellspec app is like MyFitnessPal on steroids. It is a good analogy. No more data entry… just scan it.
We use the Apple HealthKit defaults for your daily consumption of calories and macronutrients. You can either accept these defaults or set your own. Then you can see if you are below or above your target. Our app also allows you to compare two scans, which is particularly useful if you are trying not to exceed your caloric or macronutrient targets. So you can decide what food you should eat.
There’s another component of our app that is of interest, especially for those with allergies: the app allows you to track what foods you may be allergic or reacting to. It is one thing to know that you are allergic to a certain food, but quite another to know that you suffer from some allergy without knowing what you are reacting to.
SE: Right. That’s a big problem, too. You know you don’t feel good, but you don’t know why.
IH: Yes, and that problem — as I know from experience with my daughter — is a really serious problem because it takes months to pin down what exactly is making you sick. You may do allergy tests, but the allergy tests are not exact — because they are ELISA [enzyme-linked immunosorbent assay] tests, they are specifically for that day, for that moment. They’re not necessarily comprehensive either, and we know now that our microbiome has a huge influence on what we react to. For example, I was commenting this evening that I used to have problems digesting chickpeas, and for a long time I actually took enzymes to help me digest beans. About two years ago, I stopped having that problem, maybe because I have been taking probiotics for the past three years. I now produce those enzymes, which is great.
SE: Isn’t that funny, how your body changed and suddenly was able to do it on its own?
IH: It’s all to do with the microbiome. All the bacteria in my gut flora have changed. I do a lot of things such as taking probiotics to change the microbiome of my body to make it a healthy microbiome.
What the app does for this situation is allow you to say, “Okay, I just ate what I scanned, and now I do not feel so well.” So you can answer a series of questions, such as “I had a rash. I felt bloated. I had constipation,” whatever is the problem. Eventually, you can look back on the scans for each month and say, “Every time I ate, say, milk, I actually felt bloated, so the common problem here is milk. Maybe I am lactose intolerant.” Some people find it surprising that suddenly they could be lactose intolerant, although they have drunk milk since childhood and had no health issues. But it is very common that people do actually become lactose intolerant with age because we stop producing the lactase enzyme to digest milk. What’s not so common is my situation, which is not being able to digest something, and suddenly I’m able to do so.
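The look-back idea described above — spotting the ingredient common to the meals that preceded a symptom — is essentially a tally. Here is a minimal sketch with invented data and field names; the real app’s data model is not public.

```python
# Tally how often each ingredient appears in logged meals that were
# followed by a given symptom (illustrative data and field names).

from collections import Counter

meals = [
    {"ingredients": {"milk", "oats"}, "symptoms": {"bloating"}},
    {"ingredients": {"rice", "milk"}, "symptoms": {"bloating"}},
    {"ingredients": {"rice", "beef"}, "symptoms": set()},
]

def suspects(meals, symptom):
    """Count each ingredient's appearances in meals with the symptom."""
    tally = Counter()
    for meal in meals:
        if symptom in meal["symptoms"]:
            tally.update(meal["ingredients"])
    return tally

top = suspects(meals, "bloating").most_common(1)  # [('milk', 2)]
```

With a month of scans, “milk appears in every bloating episode” falls out of the counts, which is exactly the kind of pattern the interview describes surfacing.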
SE: I know people who feel absolutely awful after eating certain kinds of food. They’ve gone through allergy testing, and they still can’t figure out exactly what it is they’re allergic to. This could be a godsend to those sorts of people.
IH: Yes. It was based on my daughter’s experience, because she had to do an elimination diet, which is essentially a diet where you take everything out of your diet and then start putting one item after another back in. It’s a 21-day process, and of course after the first week you feel you are about to drop because you’ve eaten so little.
SE: I wanted to go back and ask you a question when you were describing how it works. Let’s say I’m in a supermarket, and I want to see whether produce has chemicals on it. I take Tellspec and I scan and I decide that I’m not going to eat something or not going to buy something. Will it store all my scans automatically, or do I have a choice of being able to tap and store only scans of the things that I eat?
IH: Only the things you want to store. You have to tell it to store the scans. By the way, you can take a photo of the item scanned and embed the photo into the result given by the app. This makes it easy to remember what you actually scanned.
SE: Right, so you can actually use it to determine what you will eat and make that yes or no decision, and then only store the scan of the things you have eaten.
IH: That’s correct.
SE: That’s good.
IH: You can store it and say, “Zero, I ate zero of it,” but still store it.
SE: Okay, that’s fascinating.
IH: When it’s ready to store, it asks, “How much of these did you eat?” If you’re diabetic, you may choose to use a scale.
IH: Then you can give a very accurate answer as to the weight of the food you just ate. This problem of how much you have eaten is a very interesting one because from our research we found that Europeans can estimate the weight very easily since they are used to seeing label information given always in grams. Every single label in Europe is per 100 grams. But this is not the case for North Americans; some labels are in grams, (per 100 grams, or per 30 grams), some are per serving, some per cup, some per ounce. North American people — and I suppose Australia could be similar — have a huge difficulty in estimating weight because of non-standardisation of the food label.
When we worked with those 25 diabetic people, we asked, “You’re about to eat this muffin. How much are you eating?” It was amazing that most of them couldn’t give an answer that was close to accurate. Even in imperial units they could not give a good enough answer.
SE: I totally understand. If I bought a muffin from a coffee shop, and you asked me how many grams or ounces was in that muffin, I have no way of knowing because there is no label on it.
Weighty problem: how much did I eat?
IH: Yes, but you are not diabetic and therefore you have not had to do this regularly for years every time you ate so you could estimate your insulin needs. This is why I said that for diabetic people, ideally, they could carry a scale, a portable scale.
SE: That’s a bit much to carry around.
IH: Maybe Tellspec should have a scale inside. There are now very small scales on the market, as small as a pen that opens up into a cross. In fact, there are now Bluetooth scales, so you can connect one to the scanner through Bluetooth and not even have to enter the numbers. Ideally, the scanner should have a scale inside, and this is something we’re considering, not for generation one but probably for later generations.
SE: But that would be hard in a restaurant. You get served a plate of food. You can’t just take the food off the plate and put it on a scale.
IH: I suppose you could account for the plate in much the same way a salesperson in a supermarket accounts for the container.
I foresee that in the long run, we won’t need any of this. We’ll be able to use image recognition; take a photo and from that photo calculate the weight of the food item. We’re not there yet, but I believe that we will be able to do that in a few years. There is ongoing research in this area in several universities.
SE: That would be really handy because I would imagine most people, if they don’t have severe allergies, wouldn’t be carrying a scale of any description with them. They would probably want to just wave the scanner at something and have it tell them, “Well, if you eat that muffin, that’s 800 calories, Shara. Do you really want to blow your calorie budget on one muffin, or would you rather have something else?”
IH: Your muffin that is very high in calories!
SE: Yes. I’m just making that up. No matter what it is, I was just thinking of something you would see: a plate of food, or a burger or whatever.
IH: Yes, but you’re right. The technology is not there yet, but is evolving. I have to clarify something when you talked about being in a supermarket and scanning some food and seeing some chemicals. Our technology in Tellspec doesn’t do that yet. What we offer today, in terms of detection, is still limited to calories, carbohydrates, fibre, proteins, fats, glycaemic index, insulinogenic load, sugars, and the presence or absence of gluten. That’s 10 different things.
IH: Soon, we’ll have soya, and milk, and peanuts. They’re coming. The beauty of it is that the scanner doesn’t change. We just simply upload the software.
Because we position ourselves as a consumer food safety company, people think, “They can do anything, so I can scan and see pesticides.” Well, no, we currently do not have a model for pesticides. Not yet.
SE: I can’t wait until you do that because it would be so fascinating to wave the scanner over food that’s claiming to be organic to see whether it really is organic.
IH: Organic is actually one of the more difficult problems for us. We could find pesticides on it because the soil itself would have pesticides.
SE: Or something that’s blown from a neighbouring crop and the organic farmer can’t prevent the wind blowing something.
IH: Yes, or worse. Say you have food from South America; part of my roots are in Brazil, where there is no serious regulation of pesticide use. Brazil used to be one of the safe countries where food was really healthy, but that safety is slowly but surely being destroyed by the lack of regulation. I think 90 percent of the food in Brazil today has been grown with pesticides, which is really bad.
SE: That’s really heartbreaking. It really is.
IH: It’s heartbreaking, yes, completely. This problem of organic versus non-organic is really very difficult because there’s a lot of pesticides that countries with loose regulation will use. Tellspec intends to create a detection module for a limited number of the most used pesticides.
A much easier problem is genetically modified or not modified. The idea of organic/non-organic, believe me, is one of the hardest problems.
SE: Yes. I’d like to go back to this whole idea of GMOs in a while, but I also want to just go back to, first, the core of Tellspec. Can Tellspec be used for multiple people, say, like in a family or maybe in an aged care centre? Can you use the same scanner to store results for different people?
IH: Our intention is to do that. The app currently only allows one user, but our intention is to have a family pack of up to four people.
SE: That would make sense because you wouldn’t necessarily need a different hardware device to be used by different people.
IH: That’s correct. It’s all at the software level.
SE: Yes. You just need to be able to identify who is using it at a particular time. It may be a matter of pairing the device with different smartphones. If it’s paired to my smartphone that means it’s something that I’ve eaten. If it’s paired to somebody else’s smartphone, it’s for them. But in an aged care environment where, let’s say, a nutritionist or a dietician is trying to look at tracking things for different patients, you probably wouldn’t have multiple phones. You might just have one person who’s pairing it with a phone but wanting to track different people.
IH: Yes, and we could do that. It’s in our milestones. We will certainly do it, but not right now. The reason why we’re not going to allow more than four users is because we actually incur a cost every time a scan is done. People think we don’t, but we do. We incur cloud costs on the interpretation, and costs from sending it. We don’t want to make it completely free, because then we could suddenly have huge cloud costs.
SE: One scanner?
IH: Yes, on our cloud servers, which of course are not ours. We pay a third party for those cloud services.
SE: You might end up with different subscription models. Standard subscription might include four people, and the next level of subscription might include 10 because I still think that you could use the single hardware device to be able to track multiple people.
IH: Yes, correct. That’s exactly the plan. The plan is to have different subscription models.
Nanoparticles: little pieces making big problems
SE: Okay. Well, one of the things, Isabel, that I’ve been reading about that has me particularly concerned is the use of nanoparticles in many foods. From what I understand, legislation doesn’t require nanoparticles to be even listed on the label, and one of the concerns with nanoparticles that a number of scientists have is that supposedly they behave differently in the human body than their conventional-sized counterparts. Because they’re so small they can pass directly into the blood and be deposited in organs and cause cell damage and interfere with the immune system and all sorts of horrible things. I’m wondering if a device like Tellspec might be sensitive enough to detect nanoparticles or foods that have been modified in other ways like GMO, getting back to your earlier comment.
IH: In terms of nanoparticles, near-infrared spectroscopy, which is what we use, is already being used to detect nanoparticles. However, our detection algorithms are based on organic bonds, so we are not currently tracking nanoparticles, because they are not organic. This does not mean that we could not develop detection algorithms that would work. It’s down the line.
We could look at GMO food and fairly easily build a model that could say, “This food is GMO, and this food is not.” I believe this would not be difficult to do. We know that foods like tomatoes, when they’re genetically modified, contain more water, for instance. But there are many other differences. Are they seedless? Genetically modified foods have certain properties that would allow us to differentiate them from foods that are not genetically modified. So I believe we could detect GMO by indirect inference.
Is this on the schedule of things to be done? No, not yet, for the reason that there is not enough scientific evidence to prove or disprove that genetically modified foods are detrimental to human beings. There are some studies with animals, but not enough studies to conclude that it is in fact the case.
If this were a priority based on proven science, then it would certainly become a priority for Tellspec. I think there is so much more that is a priority right now, things like: does the food have melamine? Melamine is used regularly in China to adulterate food, and we know that Chinese authorities have cracked down on groups adulterating milk by adding water and melamine to it. That adulteration killed children, and people have been executed for the crime. But melamine continues to be used in China to increase the apparent protein content of adulterated food.
Most of the scans that are done to see if food is the right food and not adulterated are actually scans for protein. Someone realised that if they put in melamine, it would go through a scanner that only tests for protein, and no one would see it.
SE: That’s fascinating. I really get where you’re coming from in terms of doing your product development roadmap based on where you see the biggest market. I have read, however, about a lot of concerns from people who are worried about GMOs and I expect that in the course of time, regardless of the scientific evidence, there will probably be a body of people who would like to be able to use the scanner to find out if food contains GMOs. I guess time will tell whether there’s a business case for you in that area.
IH: It’s probably easier to do the models for genetically modified foods than for melamine.
SE: Isn’t that funny? But with a small company, or any start-up, you have to decide where the biggest market is, as opposed to what’s the easiest product development, because having an easy product development without a big audience makes it very difficult to pay for the R&D.
IH: Yes, exactly. We are currently working with a large agriculture company that has a problem with adulteration of its products in China. Their products are sold worldwide, and they are a very responsible company, so they have decided to test twice. They did the testing in Shanghai, where they built their own lab, and then they did the testing again in their offices outside of China. They noticed that the Shanghai lab, although it’s their own lab, would say, “It doesn’t have a certain substance,” and the scary thing is that when they tested in their other lab, the result was different.
SE: Wow, that’s scary.
IH: Yes, so they realised that there are backhanders being paid in their lab in Shanghai. So they approached us with this question: can you build us an app that can detect the presence of this substance with the use of your scanner? They want to test locally in China before they bring food into their lab in Shanghai.
SE: That would save a lot of money.
IH: Yes indeed.
Tellspec’s product roadmap
SE: Well, this is such a fascinating discussion. I think I could speak with you all day, Isabel. But before we conclude our interview today, I’d like to talk about where you’re at in the product rollout. From what I can see on your website, there are a couple of generations in the plans: the first is the beta version, and the second, generation one, will be available for public purchase in early 2016. Can you give us an idea of where you’re at with the beta testing and the production unit?
IH: We have several scanners here, and we’re testing how they perform. We are improving these tests by making the scanners semi-automatic, delivering them slowly to the users, and testing the scalability of the back office. Can we handle this many people scanning at the same time and getting results?
Of course we always find problems. Each problem is being resolved slowly but surely. Generation one is scheduled for some time in the summer of 2016, based on how well the beta testers do. Some people already have their beta scanners, and we had a problem with some batteries. We had a recall of the battery because the firmware did not have the capability to show when the battery was charged, so people were overcharging the battery, and the battery was dying. We resolved that problem. The new scanners, with a light showing whether the battery is full or not, are not yet out. There are all these small problems that one has to deal with.
SE: Yes, it’s part of tech development. What’s the difference between generation one and generation two?
IH: Well, we don’t do very well with liquids right now on generation one because generation one requires the sample to be in contact with the scanner window. We hope that generation two will allow us to do liquids fully.
SE: So if I’m scanning a meal, does the Tellspec scanner actually go into the food or just above the food? How does that work?
IH: It touches or gets very close to touching. With generation two you would put some of the food into a small container that will be part of the scanner, and then scan it.
SE: I’d love to get my hands on one of those and test it just to see how it works. That will be fascinating.
IH: Well, generation two is not going to be in 2016, I don’t think.
SE: I’d be happy to test generation one. So what’s your commercial model for generation one? How will that work if people want to buy a Tellspec scanner?
IH: Right now, we’re pre-selling them on our site. It’s first come, first served, and there’s a line-up. But of course that’s not the model. We are going to open up the distribution requests as of January and start negotiations with people in mid-January, after CES, the Consumer Electronics Show. We are fundamentally interested in companies and distributors or clinics. We have 10 different detection algorithms that we use currently, and they all focus on celiac disease, diabetes or pre-diabetes, and obesity. We are encouraging groups that are actually doing work with clinics, or groups that sell to clinics in the area of obesity, weight loss, diabetes and so on, to be the first ones for us to deal with.
We don’t see the scanner being in something like a consumer-goods store immediately, but it could be. In some countries, it will be that way. We have requests from groups in two or three countries in Europe that it makes sense to go with right now because they have a huge reach, and there it would be a consumer good.
I actually see Tellspec as almost the same as Apple. When Steve Jobs came up with the iPod, it was a solution to the problem that Napster was giving away music for free, and the studios that owned the rights to this music were upset at Napster. Eventually Napster was closed. Steve Jobs came into the middle of this and said, “How can we satisfy the studios and at the same time the need that people have?” and so the iPod was invented for that. It was not invented because Apple thought going into the music industry would be its future. No way. Then suddenly Jobs realised that he could actually develop a smartphone and that the phone could play music.
SE: Yes, and then it disrupted many industries at the same time.
The power to change the way we all eat
IH: Right, and now we have all the apps. Today Apple survives fundamentally on the apps and on the music. It’s not because it sells computers, right?
I think about Tellspec the same way. It’s going to be a huge evolution, but frankly you have generation one, which is a little bit clunky. It’s still a little bit big. It doesn’t do all the allergens. It doesn’t do all the chemicals that you want. It’s building slowly. Someday you’ll have generation two. It will do liquids as well. Then, slowly but surely, these things are becoming smaller and smaller, and eventually they will be in the smartwatch. Suddenly you will be scanning with your watch, not with your phone or even with the scanner. You’ll be able to identify more and more ingredients, and suddenly you’ll be finding the GMOs in your water, or the aflatoxins and so on.
Then people will start to change the way they consume their food. Knowledge that once belonged to the large food testing labs will be in the hands of the consumer.
What is the consumer going to do with this knowledge? Well, the consumer, first of all, is going to start to eat healthier food. In general, governments are going to have the ability to promote healthy diets because diet information will be something we will all have, and we will all understand. There will be a strengthening in the systems of food governance. The World Health Organisation has a system of food governance. It tells food producers they can only have so much fat; they can only have so much sugar and so on. So there’ll be a strengthening of these existing systems, and there’ll also be tightening of industry standards on diet and nutrition.
SE: Yes, and that’s so important, Isabel, giving people the power to make informed choices for themselves, rather than being told what’s good for them and what’s not good for them.
IH: Exactly. That’s the reason we have the encyclopaedia integrated into the app: because it’s not enough to say, “Hey, you have acrylamide.” I mean, what the heck is acrylamide to start with? You’ve got to be informed.
SE: Exactly. I look at the labels at the supermarket and I think, “Okay, that’s a chemical name. I’m not a chemist. I don’t actually know what that means for me and for my health, and I don’t have the tools to make the proper decision.” I suppose I could look up everything on websites as I’m trolleying around, but that gets really quite tedious.
IH: I’ll give you a perfect example of something that I find amazing that exists in food for celiacs. My daughter is celiac, and a lot of celiac food contains a product called xanthan gum. When you look up xanthan gum, and you do enough research on this, you go, “Oh, my goodness. There’s been enough scientific evidence to show that this product disrupts your gut flora.” All celiacs already have a problem with their gut flora. They have a problem with their intestinal lining, or microvilli, and suddenly they’re putting in something that just totally destroys it. That should not be allowed. People are comfortably eating this and comfortably buying gluten-free, and they don’t realise that most of the products that are gluten-free have xanthan gum. That’s an example of it. Essentially, people will be able to understand and monitor food quality and food safety, and that’s extremely important.
SE: There are reactions to food as well. It’s not just a case of being able to monitor food quality, but being able to see and have a record of how your body reacts when you eat certain things. If you buy a certain product and you always feel kind of icky afterwards, you may know, wow, that’s the culprit here.
IH: We have lost that. I think as a society, we have really lost that connection between our food and how we feel. Maybe not so much in less advanced societies, but in North America we certainly have lost it. People don’t feel well, and they think, “Okay, I’ll just take a pill,” or, “I’ll just go to the doctor.” They don’t pause to really think twice until they’re much older: “Why is it that I’m not feeling well?” They don’t realise it may be related to food.
Our immune system is the best defence we have against anything. We all have cancer in our body, all of us, but we’re able to fight that cancer because most of us have a strong immune system. Cancer takes a hold of us when our immune system gives up, so it’s very important to build a correct and strong immune system, and the bulk of the immune system actually starts in our gut. It starts with what we eat, with what we have inside. There are still many foods that are good for us that we don’t use.
I’ll give you another example of this that’s fascinating. Most of us carry H. pylori, a bacterium that is associated with gastric ulcers. There are populations, in North America I think, that have a 40 percent prevalence of this bacterium, and it’s very difficult to eradicate. The problem with H. pylori is that not everybody will develop gastric cancer, but the people who have a predisposition to gastric cancer and who have H. pylori are at much higher risk of developing gastric cancer. Doctors then say, “Okay, let’s eradicate this H. pylori.” The eradication is currently done with a serious dose of antibiotics.
SE: That’s not good for you.
IH: Yes, it just kills everything, but the problem is that, in doing so, you destroy the gut flora, and once you kill H. pylori, you lose its check on the microorganisms that it eats. H. pylori is necessary in our body because it eats some of the microorganisms that can be very dangerous. Some of them are linked to certain neurological disorders. So by eradicating H. pylori you are also killing a bacterium that is useful to us, one that kills microorganisms linked to very serious disorders.
But H. pylori can be controlled without the use of antibiotics and without eradicating it totally, and the best news is that this can be done with food. Turmeric, the spice used in a lot of Indian food, is a perfect example of how food can be used to help our health. Turmeric keeps H. pylori from growing disproportionately. It doesn’t kill all the H. pylori; it just keeps it in balance. It doesn’t allow it to reproduce like crazy. And because turmeric is an anti-inflammatory agent, taking it regularly is very healthy for you.
Turmeric continues to be used in India, and people believe in it without understanding why, but that interaction between what we eat and how we feel has been lost. Most of our grandmothers used to say, “Well, take a little tea with lemon and honey when you have a cold.” All those small homemade things we’ve lost, because we’ve become a society of mass-produced food.
SE: And we don’t know what’s in the food. That’s the biggest problem people have. They want to eat healthy, but they don’t actually realise that they’re eating foods that aren’t healthy.
IH: Yes. The Internet of Things for food is about to be born. It’s got to be born. It’s essential that we have this knowledge; it’s as important as the geographic knowledge we have with Google Maps. Wherever we drive, we can Google Map it, and we have the path. It will be the same. We’ll have a foodprint, not a footprint: a world map of what people are eating, where they’re eating, and what diseases are associated with different areas. We can improve human health by providing that information. I look at it the same way as mobile phones have developed. They were very hard to charge, very expensive, very clunky.
I keep wishing that Tellspec generation one had a look and feel that was sleek and womanlike because I see this as a technology for women. I don’t see this as a technology for men. I think we women, we care about what we eat, and we care for what our children eat much more than men do.
SE: I know a lot of very health conscious men, too. When I’ve described Tellspec, they are just fascinated and want to know more.
IH: Yes, but they’ve got to be young, right?
SE: Yes, a lot of them are.
IH: Yes, exactly. That’s what our research showed. Those aged 20 to 40 are in tune; 45 and older, not so much. But the women are in all the way: the 20s, the 30s, the 40s, the 50s: “I want to know what’s in my food. I want my children to know.” That’s why I’d like Tellspec to look much sleeker than it does right now. I may still come up with a sleek design, or meet some wonderful designer who comes up with the solution.
SE: And that may well be someone who reads this interview or listens to it.
IH: Maybe. I think it should be an object of art. I think it should be a functional object of art. I love the way our iPhones look so sleek. Different colours, they’re thin; they’re big; they’re whatever. They look nice. Apple has managed to do something really good with their products. Some of the Samsung products are nice products too. They’re products people want to have.
SE: Yes. I’d love to have Tellspec in a ring that I just wave over food and get a result, whether it be on a smartphone or eventually on smart eyewear that will be completely integrated with my vision.
SE: Isabel, this has been such a fascinating conversation, and I’m very much looking forward to your commercial launch in early 2016. I’d love to try a Tellspec scanner.
IH: Thank you. Hopefully you will, and soon.
SE: Okay, thank you so much.
IH: Thank you, Shara.
About the author: Shara Evans is internationally acknowledged as a cutting edge technology futurist, commentator, strategy advisor, keynote speaker and thought leader, as well as the Founder and CEO of Market Clarity.