Another week of riddles ahead: will the trade war flare up or not? The EU can hardly back down in the face of Trump’s threats unless it really wants to lose its credibility entirely. How will Trump react if retaliatory tariffs are imposed or if the mutual trade agreement is considered void? If competition from American companies in the EU begins to be restricted, what will happen reciprocally across the pond? Three weeks until the earnings report, when we will also get the first guidance for this year; will we finally see a shift into extremely strong growth? The Big Pharma announcement is still pending. The moment of truth is approaching—merely continuing strong growth would be a disappointment.
I believe the country of origin for Optomed’s cameras was Thailand and the AI comes from Israel, so tariffs likely won’t hit Optomed… I agree that growth needs to start showing now, otherwise the momentum will start to slip away. Signals are starting to appear here and there; now we just need big numbers, customer accounts, and partnerships to be announced, and voilà!
According to this morning’s fresh LinkedIn update from Optomed, Lumo+Aire is immediately available in Europe. Approval came last week; fast action.
Some optician chain is advertising OCT (retinal?) scanning on TV in all its stores. What kind of device do they have for that imaging?
A tabletop camera. OCT requires such high resolution that no handheld camera can match it, at least not yet.
OCT is a completely different type of imaging for different use cases. It captures a very precise 3D structure of the fundus from a small area. Traditional fundus photography, on the other hand, captures a wide view of the fundus.
As a layman, I’d hazard a guess that OCT is better for precise diagnostics or detecting disease progression (changes), while fundus photography is for AI screening.
In Europe, it is indeed sufficient that the camera and the AI have separate approvals. Combining them doesn’t require any new approval, so in this regard, our regulation is lighter than in the US. There are, however, other issues here, such as a fragmented and so far small market.
Yeah, the market is being built for first-line AI retinopathy screening. I mainly wanted to say that Aire’s integration was ready for sale immediately once approval was received.
That is somewhat in conflict with the wording in Optomed’s press release. One could understand from it that specifically the algo+camera combo received approval:
“At the same time, Optomed’s flagship product, Optomed Lumo®, has been approved for integration into Aireen’s artificial intelligence (AI) algorithm for diabetic retinopathy screening in the EU market.”
This is (another) thing that could be asked in the financial statement interview, i.e., whether the CE approval practice has changed in this regard.
There’s certainly some paperwork and bureaucracy involved when combining them, but in Europe you don’t need to run clinical trials on the combination to prove that the camera+AI combo works. In other words, CE approval in Europe and FDA approval are completely different matters.
Anyway, it was a pointless comment on my part; apologies for the nitpicking. We can clarify these points with management in the interview after Q4 if it’s deemed necessary and there is time left.
Copilot provides good background information; it seems that the combination requires its own approval:
——
Q: Do AI and the device using it need joint CE approval when used together? Or is it sufficient that the AI and the device are separately CE-approved? This law apparently defines the grounds for approval: L_202401689EN.000101.fmx.xml
A:
Short answer: If the AI system and the device together form an entity that meets the definition of a high-risk AI system or a product where AI is a safety-critical component under EU regulation, they require a joint CE marking. If, on the other hand, the AI and the device are functionally separate and both meet their own requirements, separate CE markings may suffice. This depends on whether their combination creates a new “product” or a “high-risk AI system” according to the EU AI Act (europa.eu).
Below is a clear and practical breakdown.
When do AI + device need joint CE approval?
- If AI is part of the device’s safety-critical operation
The AI Act classifies an AI system as high-risk when, among other cases, the AI is part of a product covered by EU product safety legislation (e.g., MDR, IVDR, Machinery Regulation) (europa.eu).
If the AI affects the device’s safety, performance, or clinical decision-making, the AI + device form a single entity → joint CE marking is required.
Examples:
• Diabetic retinopathy screening device where AI makes the diagnosis → joint CE
• A machine whose safety functions are based on AI-powered sensor fusion → joint CE
• An IVDR diagnostic device where AI interprets samples → joint CE
- If the AI is integrated into the device such that a new “product” is created
The AI Act states that an AI system can be:
• embedded (built into the device) or
• non-embedded (serves the device’s operation externally) (europa.eu).
If the addition of AI changes the device’s operation, risks, or intended purpose → a new product is created, requiring a new CE assessment.
When are separate CE markings sufficient?
- If the AI and the device operate independently of each other
If the AI does not affect the safety or performance of the device but acts only as an additional service, separate CE markings may be sufficient.
Examples:
• A camera is CE-marked, and AI analyzes the images in a separate cloud service but does not make a diagnosis or safety-critical decisions.
• AI provides only reporting, statistics, or optimization without affecting the device’s risk profile.
- If the AI does not belong to high-risk categories
The AI Act defines high-risk AI systems precisely. If the AI does not belong to these categories, it does not require a CE marking based on the AI Act.
How does the AI Act (EU 2024/1689) define the situation?
The AI Act clearly states:
AI systems that are part of a CE-marked product (e.g., MDR/IVDR/Machinery Regulation) inherit the product’s conformity.
→ This means that the combination of AI and the device is assessed as a single entity.
AI systems that are independent are assessed separately according to the AI Act.
→ In this case, a separate CE marking may be sufficient.
If the addition of AI changes the device’s intended purpose or risks, a new conformity assessment is required.
→ This is the same logic as in the MDR/IVDR world.
Practical example (relevant to you)
Since you work a lot with diabetic retinopathy screening and AI diagnostics:
If:
• the camera is a CE-marked device
• the AI is a CE-marked diagnostic system (IVDR)
• the AI makes a diagnosis or affects clinical decision-making
The combination forms a high-risk AI system, and the entire entity requires a joint conformity assessment.
This is the same logic that the FDA uses for “device + SaMD” (Software as a Medical Device) combinations.
Summary
| Situation | Joint CE required? |
|---|---|
| AI affects the device’s safety or clinical decision-making | Yes |
| AI is part of a product under MDR/IVDR/Machinery Regulation | Yes |
| AI changes the device’s intended purpose or risks | Yes |
| AI operates completely separately and does not affect the device’s risk profile | No |
| AI does not belong to high-risk categories | No |
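——
To make that summary concrete, below is a rough sketch of the decision logic from the table as a small Python function. This is purely my own illustration: the field names and the function are made up, nothing like this appears in the AI Act itself, so treat it as a mnemonic rather than a compliance tool.

```python
from dataclasses import dataclass


@dataclass
class Combination:
    # True if the AI affects the device's safety or clinical decision-making
    ai_affects_safety_or_clinical_decisions: bool
    # True if the AI is a component of a product under MDR/IVDR/Machinery Regulation
    ai_part_of_regulated_product: bool
    # True if adding the AI changes the device's intended purpose or risk profile
    ai_changes_intended_purpose_or_risks: bool


def joint_ce_required(c: Combination) -> bool:
    """Mirror of the summary table above: any 'Yes' row means a joint
    conformity assessment; otherwise separate CE markings may suffice."""
    return (
        c.ai_affects_safety_or_clinical_decisions
        or c.ai_part_of_regulated_product
        or c.ai_changes_intended_purpose_or_risks
    )


# Hypothetical example: a fundus camera plus an AI that itself makes the
# retinopathy diagnosis -> a joint assessment would be needed.
print(joint_ce_required(Combination(True, True, False)))  # True
```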
Why is the EU a difficult market? Don’t all member states here have the same directives? At least in my own work in industry, the EU is the easiest export region of all.
In the US, there are incentives that make the acquisition and use of AI devices cheaper for providers, as they receive a set reimbursement (x amount) for every screening. This makes the purchase attractive.
Of course, these exist in some European countries as well, but generally not as explicitly as the CPT and HEDIS codes in the US. Europe usually lags behind in the adoption of health technology and support schemes.