
Good morning! After a two-year dry spell, initial public offerings by biotech companies showed signs of life during the first three months of 2024. 

But it’s too early to say that the biotech IPO market has fully recovered.

Biotech IPOs appeared to reach pre-pandemic levels during the first quarter, with nine companies collectively raising more than $1.3 billion, according to a database from BioPharma Dive. That is more than three times the roughly $375 million raised from biotech IPOs in the first quarter of 2023. 

Here are the companies that went public in the first quarter of 2024, according to the BioPharma Dive database: 

  1. CG Oncology – 1/24, raised $380 million
  2. ArriVent Biopharma – 1/25, raised $175 million 
  3. Alto Neuroscience – 2/1, raised $129 million 
  4. Fractyl Health – 2/1, raised $110 million 
  5. Kyverna Therapeutics – 2/7, raised $319 million 
  6. Telomir Pharmaceuticals – 2/8, raised $7 million
  7. Metagenomi – 2/8, raised $94 million 
  8. Chromocell Therapeutics – 2/15, raised $7 million 
  9. Boundless Bio – 3/27, raised $100 million 

Another company, Contineum Therapeutics, went public on April 4 and raised $110 million. 

Six of the nine IPOs to price between January and March raised $100 million or more. Kyverna Therapeutics and CG Oncology raised $319 million and $380 million, respectively. The latter now trades well above its IPO price. 

But the last few biotech companies to price IPOs in the quarter “haven’t traded so well,” Mike Perrone, Baird’s biotech specialist, told CNBC. 

For example, gene editing drugmaker Metagenomi priced at the bottom of its projected price range in February, and has since lost more than half of its value. That’s adding skepticism about the prospects for the biotech IPO market the rest of the year.

“We kind of started Q1 with a roar and ended with a whimper,” Perrone said.

The issues in part reflect the Federal Reserve’s decision to wait longer than previously expected to cut rates following a series of surprisingly high inflation readings, he said. 

“A lot of the early biotech IPO enthusiasm this year was on the back of expectations of earlier rate cuts, and risky assets like biotech with longer-dated cash flows bode well during rate cut environments,” Perrone said. “But as inflation has remained sticky and as the Fed has continually pushed out rate cuts until later this year, I think some of that enthusiasm has come off.” 

So, what will biotech IPO activity look like for the rest of the year? 

A typical “strong year” looks like about 50 IPOs based on the last 10 years, according to Arda Ural, EY’s Americas industry markets leader in health sciences and wellness. The biotech sector isn’t on pace to hit that number, with only 10 IPOs priced so far in 2024. 

“Things will probably stay below the normal year,” Ural said. But that may change, he noted. 

If the Federal Reserve starts interest-rate cuts as early as its late-July meeting, “you’re looking at a different second half of the year for IPOs … it will certainly send us in a very positive direction,” Ural said. 

He called it “delayed cautious optimism.”

By comparison, biotech IPOs had a landmark year in 2021 as the wild success of Covid vaccines and therapeutics during the pandemic renewed investor optimism. Roughly 110 biotechs priced an initial offering and collectively raised around $15 billion that year. 

But that momentum began to stall in 2022 and slowed to a snail’s pace in 2023: The biotech sector saw only 22 and 19 IPOs during those years, respectively. 

The Fed’s interest rate hikes were a big driver of the downturn, according to Perrone. He said the poor performance of new publicly traded companies also contributed, specifically due to a higher number of clinical trial failures. 

Notably, most of the drugmakers that priced offerings between 2020 and 2022 were in preclinical or early-stage clinical testing, which Perrone called “abnormal.” 

“I’d say the downturn was a combination of both interest rates starting to increase and all these young companies having higher than average failure rates,” Perrone told CNBC. “That kind of soured the market.”

The good news about this year is that the vast majority of biotech companies that have priced IPOs so far have tested their products in humans to some extent, reflecting an investor shift toward safer bets. Perrone called that a “healthier situation” and a more “normalized environment.” 

But the bottom line is that we’ll have to keep “rate-watching” to see what the pace of biotech IPO activity will look like moving forward, Perrone said. Stay tuned for our coverage in this area. 

Feel free to send any tips, suggestions, story ideas and data to Annika at annikakim.constantino@nbcuni.com.

Latest in health-care technology

Doctors are using VR and AI to hone their skills. Here’s what it’s like

Last week, I spent an afternoon at Weill Cornell Medicine in New York City to explore the metaverse with Dr. Rohan Jotwani and Dr. John Rubin. 

Jotwani and Rubin are anesthesiologists at the medical center, and they also serve as co-directors of the Extended Reality Anesthesiology Immersion Lab, or XRAIL. 

Anesthesiologists are doctors who specialize in pain management, critical care medicine and, of course, anesthesia, the use of medications that help keep patients comfortable during procedures. It’s a crucial specialty that requires doctors to use both technical and emotional skills, as working closely with patients in pain can be challenging. 

XRAIL was founded last year to help anesthesiologists and anesthesiologists-in-training hone their abilities. Jotwani and Rubin believe tech like virtual reality and artificial intelligence can improve medical education and clinical practice within the specialty.

For instance, the pair has designed a series of lessons to help doctors learn and practice procedures by working with 3D models in VR headsets. I observed one of these lessons between Jotwani and Dr. Chrissy Cherenfant, Weill Cornell’s chief resident in anesthesiology. 

Before the lesson got underway, we all met in a room in the medical center to get acquainted with each other and our headsets. The lab primarily uses headsets from Meta (we put on the Meta Quest 2), but it is also exploring use cases for Apple’s new headset, the Vision Pro. Cherenfant and I had never used a VR headset before, and I felt like we both got the hang of it fairly quickly. 

Though we were all together in the same space, the headsets can be used remotely, which means doctors can meet up in VR even if they are in different places. XRAIL has a handful of headsets it can provide to residents, and Jotwani said a class of around six to eight people is usually the sweet spot. 

The experience was immersive as soon as I put on the headset. Once we all joined the session, I could see Jotwani and Cherenfant’s avatars, as well as a 3D model of a spine in the middle of the room. Cherenfant and I watched as Jotwani made the model bigger and smaller, picked up individual bones and muscles, turned them at different angles and drew in the air. 

I thought the model was a helpful way to break down complex concepts and get into the nitty gritty of anatomy, which would be hard to do with just 2D images in a textbook. It was easy to see how the headset could serve as a useful educational tool.

“I wish I had this when I was an intern,” Cherenfant said during the lesson. 

The technology is far from perfect. Sometimes the avatars would get in the way of the model, blocking my view. If I could see something from where I was sitting, Cherenfant often couldn’t, so getting the positioning exactly right was tricky. On occasion, the model would suddenly appear giant, the environment would look pixelated or fuzzy and we’d get kicked out of the session by accidentally stepping out of bounds. There are some issues to resolve. 

The headsets also don’t replace the feeling of a procedure – what it’s like to put a needle through skin, for instance. Holding a controller is not the same thing as using a medical instrument. 

Even so, VR is an easy and relatively inexpensive way for residents to practice a surgery as many times as they’d like. Jotwani and Rubin believe it beats relying on images, videos and trips to the cadaver lab, which can be few and far between. 

While XRAIL is using VR to help teach technical skills like how to carry out procedures, it is also using AI to teach soft skills like how to speak to and listen to patients. Jotwani and Rubin, who are not engineers by background, have built around 10 different AI-powered conversational agents that doctors can practice speaking with in real time. 

Jotwani said Weill Cornell has usually done this by hiring actors to simulate situations that physicians may encounter. It’s a time-consuming and pricey endeavor, as it can take upwards of eight hours to train the actors and ensure that their portrayals are realistic. 

The actors also stick to a script, which means there is only so much they can do, Jotwani added. The conversational agents, by contrast, can have more free-form discussions. 

After Jotwani booted up his computer, I “met” an agent named CARL, which stands for Conversational Agent Relief Learning in Pain Management. We chatted about the chronic pain CARL experiences, and he told me details about his history and his life, down to the couch in his virtual NYC apartment. 

I was really impressed – maybe even a little unnerved – by how naturally the conversation flowed. CARL is just a computer model, but it seemed like he had a personality, and I thought he conveyed feelings like frustration and discomfort convincingly. 

There is a little bit of a lag in CARL’s responses, probably one to two seconds, so the conversation isn’t exactly like speaking with a real person. I was also instructed to speak in complete sentences, so I was more conscious of my words than I normally am. 

But once again, it was easy to see how CARL could serve as a valuable way for doctors to practice engaging with patients in a risk-free setting. I’d personally rather have my doctor ask the wrong questions to an AI agent than to me. 

“We’re really interested in building more models like CARL, models that sort of challenge our residents to think beyond just how do I pass this exam to how do I treat real life people with complex stories,” Jotwani said.  

Jotwani and Rubin are just getting started, and they’re already being asked to speak about their work regularly. Over the next couple of years, they plan to expand XRAIL’s capabilities and bring the technology to other organizations. 

“I think that there are a lot of opportunities,” Rubin said.

Feel free to send any tips, suggestions, story ideas and data to Ashley at ashley.capoot@nbcuni.com.
