Leonardo Falaschini
UX+UI Product Designer

Digital Product Designer with over a decade of experience in UI, UX and Interaction design for mobile apps, having also delved into XR (VR/AR), data visualization, and B2B web-based tools. I'm passionate about crafting user flows, mockups, animations, and clickable, high-fidelity prototypes. I have experience devising design strategies and capacity planning in adherence to agile best practices, mentoring junior designers, and exchanging knowledge with peers and stakeholders.

Design Lead, UX, UI, Branding, VR/AR

Touch Surgery

Touch Surgery (later rebranded Digital Surgery) is a surgical training simulation, for all levels of medical practice, contained in a mobile app. It holds a vast library of procedures and articles from renowned sources such as Stanford and Imperial, allowing users and program directors to track learning progress and performance.

Design Lead, UX, UI, Branding

HiCarer

I conducted User Story Mapping with the team to understand and parse user needs and prioritise solutions for both people in need of care and caregivers (those caring professionally, or the wider support circle of family and friends).
Concurrently, I developed a consistent brand design from scratch, which was then applied to a growing and complex UI Design System.

Design Lead, UX, UI, Branding

Kirontech Health Insurance Platform

Kirontech helps health insurance payors tackle fraud, waste and abuse in claims. It is driven by AI and machine learning, and backed by medical and specialist fraud experts.

In compliance with a non-disclosure agreement with Kirontech, I can only provide a live presentation via remote screen-share or in person. Please contact me below, or through your recruiter, to schedule.

UX, UI, Interaction Design

SAM Labs' mobile visual programming

As a UI / Interaction Designer, I delivered a series of advanced clickable prototypes to effectively communicate design concepts and facilitate internal discussions during development. The purpose of the app (later rebranded SAM Labs App STEAM Kit) was to provide an intuitive visual programming tool for a set of IoT modules such as servos, buttons, lights and sliders. It was crafted to engage users of all ages, offering a seamless and immersive experience in controlling the hardware set.

UI, UX Designer

Digital Garage / Damage Capture

Digital Garage was created to help users stay aware of their vehicle's condition, simplify the recording process, and estimate damage to file a claim with their insurer, with an emphasis on user experience while catering to the unique requirements of a broad and varied target audience.

Contact me

Send me an email and I'll get in touch as soon as possible.

Thank you, I'll be in touch!

Design Lead, UX, UI, Branding, VR/AR

Touch Surgery
Mobile App

Touch Surgery (Later rebranded Digital Surgery) is a surgery training simulation, for all levels of medical practice, contained in a mobile app.
The scoring system would ultimately count towards users' clinical or medical education credits, supported by institutions like Imperial College or Stanford.

The app would have (and indeed ended up having) two modes, Learn and Test. In the Learn mode, users would have access to a library of CGI content representing a surgical procedure. This content was designed to be engaging and interactive, ensuring that users retain information, and to provide a visible score representing their progress.

To ensure commercial viability, we created a business model canvas with stakeholders to identify potential revenue streams (and their pitfalls), including in-app purchases, subscription models, or partnering with medical device companies.

We outlined user personas and mapped User Stories and Needs to define the journey and flow of the app, outlining its structure and how users would navigate it. (Above)

Thinking with our hands by laying out the map of the app on a whiteboard is one of the best strategies I can recommend. It allowed us to empathise with the user feedback that regularly came from medical students and authors.

An Atomic Design System ensured consistency across the app and made it easier to iterate and scale as needed. The system comprised reusable components, such as buttons, icons, and form fields, that could be combined and reused to build new interfaces when necessary.
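The idea of composing small, reusable pieces into larger interface elements can be sketched roughly as follows. This is a minimal illustration of atomic composition; the type and component names are invented for the example, not taken from the actual design system.

```typescript
// A minimal sketch of atomic composition: small, reusable "atoms"
// are combined into larger "molecules". All names are illustrative.

type Atom = { kind: "button" | "icon" | "field"; label: string };

type Molecule = { name: string; atoms: Atom[] };

// Compose a molecule from any number of atoms.
function compose(name: string, ...atoms: Atom[]): Molecule {
  return { name, atoms };
}

// Example: a hypothetical search bar built from three existing atoms.
const searchBar = compose(
  "SearchBar",
  { kind: "field", label: "Search procedures" },
  { kind: "icon", label: "magnifier" },
  { kind: "button", label: "Go" },
);
```

Because every molecule is built from the same small pool of atoms, a change to an atom propagates consistently across the interfaces that use it.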

Design Lead, Game Design, UX, UI, Branding, VR/AR

Touch Surgery Simulation

During my work at Touch Surgery, I had the opportunity to design a Heads Up Display (HUD) UI layer for a 3D surgical simulation. The use of a smartphone's gyroscope would add that extra layer of complexity and realism that the 2D modules lacked.

When we put together the team to develop a 3D Simulation Module, the app already hosted a great deal of pre-rendered 2D content. While the game mechanics remained similar (though due an upgrade), the UI needed an update to meet the requirements of a 3D or XR simulation, which could in turn inform an update to the existing 2D educational modules.

In response, I designed a set of Cursor-To-Target mechanics representing different surgical actions, like suturing, diathermy, forceps retraction, swabbing, etc. Below is a sample animation I created in Adobe After Effects.

The HUD included a timeline that dynamically expanded and collapsed to show all stages of the procedure, from preparation to completion. The tool set displayed surgical instruments, and camera/gyroscope controls allowed for flexible navigation. Additionally, an information panel provided instructions and suggestions to help guide the user through the process. The video below shows an early stage of the prototype, with most functionalities in place. This prototype was initially tested on Oculus Rift over Unreal Engine and was later implemented into an Augmented Reality HoloLens prototype, keeping most design functionalities intact.

Design Lead, UX, UI, Branding, VR/AR

Touch Surgery R&D
Google Glass Training Companion

The emergence of Google Glass, while premature and short-lived, sparked a wave of R&D projects focused on leveraging Augmented Reality in the workplace. These efforts aimed to utilize voice commands and gesture recognition, instead of touch-sensitive peripherals, to perform tasks in environments where such peripherals are impractical.

The Operating Room provides a prime example of such an environment. Although the technology was originally developed for use in situations such as the battlefield, training, and remote assistance, its potential in this context remains largely unexplored.

The video above showcases a real-life surgery, but the footage has been edited to omit graphic content. It provides a glimpse of the basic functionality of the assisting UI during the marking and planning stage. Below is part of the process of storyboarding a sample procedure and User Journey, the product of one-on-one qualitative research, namely interviews with surgeons at Imperial College and Guy's Hospital. This allowed us to develop a test module that could run on Google Glass, complete with voice commands for hands-free operation.

Design Lead, UX, UI, Branding

HiCarer

I conducted User Story Mapping to understand and parse user needs and prioritise solutions for both people in need of care and caregivers (those caring professionally, or the wider support circle of family and friends).
Concurrently, I developed consistent Branding from scratch, which was then applied to a growing and complex UI Design System.

In the initial phase of the design process for an app catering to carers, individuals in need of care, and their social circle, user story mapping played a crucial role in understanding user needs and experiences. I proposed this framework to the team so as to identify provisional user personas and journeys, and to map these to key features such as the care diary, medication reminders, timesheets, and a job market connecting carers and clients.

The second stage of the design process involved developing clickable wireframes, which allowed us to refine the app's layout, navigation, and overall user experience. These wireframes offered an interactive representation of the app, making it easy for the client, stakeholders, experts, field practitioners, and representatives from relevant organizations to provide early, valuable insights and help us make any necessary changes, including the design of a complex questionnaire, or Care Needs Assessment, which featured branching sub-questions (below).
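The branching behaviour of such a questionnaire can be modelled as a simple tree, where an affirmative answer reveals its sub-questions. The sketch below is purely illustrative; the question texts and ids are invented, not taken from the actual Care Needs Assessment.

```typescript
// Sketch of a branching questionnaire: a "yes" answer to a question
// reveals its sub-questions. All content here is invented for illustration.

interface Question {
  id: string;
  text: string;
  subQuestions?: Question[]; // shown only when the parent is answered "yes"
}

// Walk the tree, collecting the questions a respondent actually sees
// given their yes/no answers so far.
function visibleQuestions(
  questions: Question[],
  answers: Record<string, boolean>,
): Question[] {
  const out: Question[] = [];
  for (const q of questions) {
    out.push(q);
    if (answers[q.id] && q.subQuestions) {
      out.push(...visibleQuestions(q.subQuestions, answers));
    }
  }
  return out;
}

const assessment: Question[] = [
  {
    id: "mobility",
    text: "Does the person need help moving around?",
    subQuestions: [
      { id: "stairs", text: "Do they need help with stairs?" },
      { id: "aids", text: "Do they use mobility aids?" },
    ],
  },
  { id: "meals", text: "Do they need help preparing meals?" },
];
```

Keeping the branching logic in the data, rather than in the screens, means designers can add or reorder sub-questions without touching the navigation flow.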

I was happy to engage my inner illustrator and deliver custom illustrations and iconography that conveyed a gentle and amicable tone without being overly cheerful about the subject matter. The client described them as "sweet" and on-point: minimalistic and inclusive, resonating with users of all genders and age groups.

In parallel, I focused on creating a comprehensive, platform-agnostic Atomic Design System. I promoted UI best practices and ergonomic standards such as GOV.UK's design guidelines, paying particular attention to high readability for visually impaired users. I later applied this design system to the entire app and a marketing website (below).

UI, UX Designer

Digital Garage / Damage Capture

Digital Garage was created to help users stay aware of their vehicle's condition, simplify the recording process, and estimate damage to file a claim with their insurer, with an emphasis on user experience while catering to the unique requirements of a broad and varied target audience.

In designing the app’s dashboard, I prioritized the accessibility of critical information, such as parts or accessory recalls, checkups, and MOT due dates. I designed the dashboard to provide a user-friendly experience, with the integration of different in-built connectivity systems. Additionally, I delivered a collection of rapid, interactive UI prototypes (right), utilizing Principle App, to facilitate internal discussions and design refinements.

The other aspect of my work involved designing a 3D environment in the Unity3D engine that enabled users to visually document and represent the damage to their vehicles. It gave users the capability to 'paint' the damage, and to edit, rotate, and scale the visuals to accurately depict the extent and estimate the cost of the damage. Additionally, users could capture the damage in photos, altogether offering a method for documenting and assessing the damage before submitting claims to their insurance providers.

Design Lead, UX, UI

SWGfL's
Minerva

Minerva Link

From the above (low-resolution) wireframes, I progressed into mid-resolution mockups. While the protagonists remained a Query or Search page, an alert system, and support for comments, I also explored claim input forms. Examining this route was still of great value, even if we were not to provide an interface for manual, individual claim entry (most claims processing is already digitalized and processed in bulk). There was an inescapable need, and much to gain, in gathering input from our in-house counter-fraud specialists and data scientists, and then comparing notes with clients. This helped us add more detail to the data structure (AKA domain model) of a Medical Claim and pair it to ICD (International Classification of Diseases) codes.
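To give a flavour of what a claim domain model paired with ICD codes might look like, here is a heavily simplified sketch. The field names and structure are assumptions for illustration only, not Kirontech's actual schema.

```typescript
// Illustrative sketch of a medical claim domain model paired with
// ICD codes. Field names are assumptions, not an actual schema.

interface IcdCode {
  code: string;  // e.g. "J45.9"
  label: string; // human-readable description
}

interface ClaimLine {
  procedure: string;
  amount: number;       // billed amount, in minor currency units
  diagnoses: IcdCode[]; // ICD codes justifying this line
}

interface MedicalClaim {
  claimId: string;
  providerId: string;
  lines: ClaimLine[];
}

// A simple derived value a claims UI might surface: total billed amount.
function totalBilled(claim: MedicalClaim): number {
  return claim.lines.reduce((sum, line) => sum + line.amount, 0);
}

const sampleClaim: MedicalClaim = {
  claimId: "CLM-001",
  providerId: "PRV-042",
  lines: [
    {
      procedure: "Consultation",
      amount: 1000,
      diagnoses: [{ code: "J45.9", label: "Asthma, unspecified" }],
    },
    {
      procedure: "Spirometry",
      amount: 2500,
      diagnoses: [{ code: "J45.9", label: "Asthma, unspecified" }],
    },
  ],
};
```

Tying each claim line to the ICD codes that justify it is what lets counter-fraud tooling surface mismatches between procedures and diagnoses.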

Design System

UX, UI, Interaction Design

SAM Labs' mobile visual programming

As a UI / Interaction Designer, I delivered a series of advanced clickable prototypes to effectively communicate design concepts and facilitate internal discussions during development. The purpose of the app (later rebranded SAM Labs App STEAM Kit) was to provide an intuitive visual programming tool for a set of IoT modules such as servos, buttons, lights and sliders. It was crafted to engage users of all ages, offering a seamless and immersive experience in controlling the hardware set.

At the outset of the design process, I was presented with an established brand that had not yet been implemented on a mobile platform. During this pre-multiplatform era, we prioritized developing an iOS app as a proof of concept. In the first stage, I focused on designing and prototyping the main navigation elements, ensuring seamless transitions between the user profile, homepage, and the visual programming canvas.

Initially, low-fidelity prototypes (below) facilitated rapid iterations with the team and encouraged feedback from prospective users, including children. These and a previous version of the desktop app were both tested in a two-day session at the International School of Brussels (photo is illustrative only), in which I participated by conducting qualitative research with students ranging from ages 8 to 17.

With this insight, I progressed to crafting high-resolution prototypes, incorporating more intricate interactions for connecting the IoT modules and establishing behaviours and triggers between them. These hi-res prototypes helped visualize the complete user experience and refine the final product, ensuring that it was both visually appealing and functional. This stage allowed me to refine the interactions and create a bottom sheet inventory that housed the programmable IoT modules, providing a solid foundation for the subsequent stages of design, scalable to tablet and desktop versions.
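Conceptually, the canvas of such a visual programming tool is a graph: modules are nodes, and each trigger wires an output event of one module to an action on another. The sketch below illustrates that model only; the ids, events, and actions are invented, not SAM Labs' actual data model.

```typescript
// Sketch of a visual-programming connection model: modules are nodes,
// triggers are directed edges. All names are invented for illustration.

type ModuleKind = "button" | "light" | "servo" | "slider";

interface Module {
  id: string;
  kind: ModuleKind;
}

interface Trigger {
  from: string;   // module emitting the event
  event: string;  // e.g. "pressed", "moved"
  to: string;     // module receiving the action
  action: string; // e.g. "toggle", "rotate"
}

// Which modules does a given module drive through its triggers?
function targetsOf(moduleId: string, triggers: Trigger[]): string[] {
  return triggers.filter(t => t.from === moduleId).map(t => t.to);
}

const modules: Module[] = [
  { id: "btn1", kind: "button" },
  { id: "led1", kind: "light" },
];

// One wire on the canvas: pressing the button toggles the light.
const triggers: Trigger[] = [
  { from: "btn1", event: "pressed", to: "led1", action: "toggle" },
];
```

Representing connections as plain edges like this is what makes the canvas scalable: adding a module or wire never changes the existing ones.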