System For Capturing The Movement Pattern Of A Person

Invented by Horst-Michael Groß, Andrea Scheidig, Thanh Quang Trinh, Benjamin Schütz, Alexander Vorndran, Andreas Bley, Anke Mayfarth, Robert Arenknecht, Johannes Trabert, Christian Martin, and Christian Sternitzke; TEDIRO Healthcare Robotics GmbH
In this article, we will explore a patent application for a computer-implemented system and method that captures and evaluates a person’s movement patterns—particularly those involving body elements like limbs—using non-contact sensors and advanced software. We will break down what makes this technology important, how it builds on prior inventions, and what new features it introduces.
Background and Market Context
Healthcare systems around the world are under pressure. There are not enough trained therapists and caregivers to meet the growing needs of patients, especially in rehabilitation. When people recover from injuries or surgeries, like a hip replacement, they need help learning to move correctly again. This helps them avoid pain, reduce the risk of further injury, and recover faster. However, with fewer professionals on hand, some patients do not get enough attention or training. This can lead to poor recovery, longer hospital stays, and even new health problems.
Hospitals and clinics also face another problem: they need to keep detailed records of patient progress and therapy. This is important both for tracking recovery and for legal reasons—to show they did their job properly if there are ever complaints.
Technology is stepping in to help. Service robots and computer systems are being used more and more in hospitals and rehab centers. These robots can help patients practice walking, climbing stairs, and doing other movements, even when a therapist is not present. They can also keep accurate records of every session and help standardize care, so all patients get the same level of support no matter which therapist is on duty.
But there are challenges. Many existing service robots only help with basic tasks like delivering food or guiding patients through buildings. Some can monitor simple health signals, like heart rate or whether a patient has fallen. Others can help with basic exercises, but they do not always track movement with enough detail to give meaningful feedback. This is especially true when patients use walking aids like crutches, which make movement more complicated.
The invention discussed in this patent application addresses these issues. It describes a system that can watch how a person moves using cameras or other sensors, build a virtual “skeleton” model of their body, and compare their movement to a standard or correct pattern stored in memory. If the person moves incorrectly, the system can notify them right away and suggest corrections. It can also recognize walking aids and factor them into its analysis, making it much more useful for real rehabilitation situations.
This type of technology is especially valuable in settings where therapists are busy or in short supply. It can help patients practice more often, get immediate feedback, and track their progress over time—all while freeing up staff to focus on the most complex cases.
Scientific Rationale and Prior Art
The idea of using technology to help with physical rehabilitation is not new. Over the years, researchers and inventors have created many kinds of robots and computer systems for hospitals. Some robots deliver medicine, guide patients, or even record simple exercises. Others use sensors to check vital signs or detect when a patient has fallen.
A key step forward was the use of cameras and depth sensors—like the Microsoft Kinect and similar 3D cameras—to track how people move. For example, studies have shown that Kinect-based systems can measure joint positions and walking patterns almost as well as expensive, stationary lab equipment. These systems can track things like step length, speed, and how much time each foot spends on the ground. They can also measure angles at the knees, hips, and trunk.
Some robots have also used laser scanners (LIDAR) and 2D cameras to find and follow people around hospitals, or to recognize when someone is sitting or standing. There are even robots that can recognize walking aids, like crutches, using lasers or cameras.
Despite these advances, there have been gaps. Most earlier systems could not give detailed, real-time feedback about how a patient was moving. They might record data for later review, but they could not compare a person’s movements to a standard pattern in real time. Few could recognize when a patient was making mistakes with walking aids, or provide instant corrections based on what they saw.
Other systems relied too much on therapists to interpret the data, or needed special markers or sensors attached to the patient’s body, which could be uncomfortable or impractical in real-life settings. There was a need for a smart, automated system that could watch, analyze, and guide patients during movement exercises without needing to touch them or attach anything to their bodies.
Prior patents and research mentioned in this application include:
– Robots that help with logistics or basic care tasks, like delivering supplies or guiding patients.
– Robots that can monitor basic health signs, detect falls, or record short video clips.
– Systems that can track people using cameras or laser sensors, and recognize whether someone is using a walking aid.
– Studies validating the use of Kinect and similar systems for gait analysis, showing they can measure key movement parameters with enough accuracy for clinical use.
The new system builds on these ideas, but goes further. It offers detailed, real-time analysis of movement patterns, supports both patients and therapists, and can be used with or without walking aids. It also includes machine learning features, so it can get better over time by learning from more patient data.
Invention Description and Key Innovations
At its core, the invention is a smart system—often embodied as a service robot—that watches people as they move and helps them improve. Let’s break down how it works and what makes it special.
How the System Works:
– The robot or system uses non-contact sensors—such as a 2D camera, a 3D depth camera, LIDAR, radar, or ultrasound—to watch a person as they walk, climb stairs, or perform other movements.
– As it watches, the system turns the images into a virtual “skeleton” model. This model marks the positions of key body parts: head, shoulders, arms, hips, knees, and feet.
– If the person uses a walking aid, like crutches, the system also tries to find and track these aids.
– The system compares how the person is moving—step by step—to a standard movement pattern stored in its memory. These standard patterns may come from healthy people, from medical guidelines, or from the patient’s own earlier sessions.
– If it detects errors or differences—like taking steps that are too short, leaning too far forward, or using a crutch incorrectly—the system can notify the person right away. The notification could be a spoken message, a display on a screen, or even a projected image on the floor.
– The system can prioritize which errors are most important and may only alert the user to the most serious issue at any given time, so they are not overwhelmed. A simplified sketch of this capture-compare-notify loop follows this list.
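To make the loop described above more tangible, here is a minimal Python sketch of how such a capture-compare-notify cycle might be organized. None of the function names, the keypoint format, or the threshold come from the patent itself; the sensor, skeleton estimator, and notifier are reduced to stubs purely for illustration.

```python
import time

def capture_frame(sensor):
    """Stub: grab one frame from the 2D/3D sensor (placeholder data here)."""
    return {"timestamp": time.time(), "image": None}

def estimate_skeleton(frame):
    """Stub: return named keypoints (hypothetical format: name -> (x, y, z) in metres)."""
    return {"hip": (0.0, 0.9, 0.0), "knee": (0.0, 0.5, 0.1), "ankle": (0.0, 0.1, 0.2)}

def compare_to_reference(skeleton, reference):
    """Return a list of (priority, message) deviations from the stored pattern."""
    deviations = []
    # Toy check: the knee should stay roughly under the hip (assumed tolerance).
    if abs(skeleton["knee"][2] - skeleton["hip"][2]) > reference["max_knee_offset_m"]:
        deviations.append((1, "Keep your knee under your hip."))
    return deviations

def notify(message):
    """Stub: would drive speech output, the screen, or a floor projection."""
    print("FEEDBACK:", message)

reference_pattern = {"max_knee_offset_m": 0.15}   # would be loaded from memory or the cloud

for _ in range(3):                                # in practice: run for the whole session
    frame = capture_frame(sensor=None)
    skeleton = estimate_skeleton(frame)
    issues = sorted(compare_to_reference(skeleton, reference_pattern))
    if issues:
        notify(issues[0][1])                      # only the highest-priority correction
```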
Key Innovations:
1. Non-Contact, Real-Time Movement Analysis:
Unlike systems that require special markers or sensors attached to the body, this invention uses cameras or other sensors to watch and analyze movements without touching the person. It processes the information in real time, so feedback can be immediate.
2. Skeleton Modeling and Feature Extraction:
The system builds a “skeleton” model from the captured images, identifying key points (like knees, hips, feet) and measuring angles, distances, and timing. It calculates important movement features, such as step length, stance duration, and body lean. This makes the analysis much more detailed and useful than simple step counters or pedometers.
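To make "step length" and "body lean" concrete, here is a hedged sketch of how such features could be computed from skeleton keypoints. The keypoint names, the coordinate convention (x along the walking direction, y vertical, in metres), and the sample values are assumptions for the example, not details from the patent.

```python
import math

# Hypothetical keypoints for one frame: name -> (x, y) in metres.
keypoints = {
    "left_ankle":   (0.35, 0.08),
    "right_ankle":  (0.00, 0.10),
    "shoulder_mid": (0.05, 1.45),
    "hip_mid":      (0.02, 0.95),
}

def step_length(kp):
    """Horizontal distance between the ankles (a rough stand-in for step length)."""
    return abs(kp["left_ankle"][0] - kp["right_ankle"][0])

def trunk_lean_deg(kp):
    """Forward lean of the trunk: angle of the hip-to-shoulder segment from vertical."""
    dx = kp["shoulder_mid"][0] - kp["hip_mid"][0]
    dy = kp["shoulder_mid"][1] - kp["hip_mid"][1]
    return math.degrees(math.atan2(dx, dy))

print(f"step length: {step_length(keypoints):.2f} m")
print(f"trunk lean:  {trunk_lean_deg(keypoints):.1f} deg")
```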
3. Walking Aid Recognition and Integration:
For patients who use crutches or other aids, the system can identify the aids in the video, track their position, and compare how the aids are used relative to the body. It can detect mistakes like placing a crutch too far forward or not in sync with the leg that needs support. This is especially important for post-surgery rehabilitation.
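As a toy illustration of the kind of crutch-placement check described here, the sketch below flags a crutch tip planted too far ahead of the foot of the affected leg. The 0.4 m limit and the single-frame positions are invented for the example; a real system would tune such tolerances per patient and per exercise.

```python
# Positions along the walking direction, in metres (hypothetical single-frame values).
affected_foot_x = 0.20     # foot of the operated leg
crutch_tip_x = 0.70        # detected tip of the crutch on the same side

MAX_CRUTCH_LEAD_M = 0.40   # assumed tolerance, not a clinical recommendation

lead = crutch_tip_x - affected_foot_x
if lead > MAX_CRUTCH_LEAD_M:
    print("Correction: place the crutch closer to your foot "
          f"(it is {lead:.2f} m ahead, limit {MAX_CRUTCH_LEAD_M:.2f} m).")
```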
4. Comparison to Stored Patterns with Automated Error Detection:
The system compares the patient’s movements to stored “correct” patterns, which can be tailored to the patient’s condition or stage of recovery. Deviations are detected automatically, and feedback is given right away.
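One simple way to picture such a comparison is as a set of per-feature tolerance ranges taken from the stored "correct" pattern. The features, ranges, and measured values below are illustrative assumptions only; the patent does not specify these numbers.

```python
# Stored reference pattern: feature -> (lower bound, upper bound).
reference = {
    "step_length_m":  (0.30, 0.65),   # metres
    "stance_time_s":  (0.50, 0.90),   # seconds per stance phase
    "trunk_lean_deg": (-5.0, 10.0),   # forward lean in degrees
}

# Features measured for the current gait cycle (hypothetical values).
measured = {"step_length_m": 0.22, "stance_time_s": 0.70, "trunk_lean_deg": 14.0}

def find_deviations(measured, reference):
    """Return (feature, value, bounds) for every feature outside its tolerance range."""
    out = []
    for name, (lo, hi) in reference.items():
        value = measured[name]
        if not (lo <= value <= hi):
            out.append((name, value, (lo, hi)))
    return out

for name, value, (lo, hi) in find_deviations(measured, reference):
    print(f"{name}: {value} outside [{lo}, {hi}]")
```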
5. Prioritization and Custom Feedback:
Not all mistakes are equally important. The system assigns priority levels to different types of errors. It only notifies the patient about the most urgent corrections, reducing confusion and helping them focus on one thing at a time.
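A minimal way to implement "one correction at a time" is to rank detected deviations against a priority table and only surface the top entry. The priorities below are invented for the sketch; the patent leaves the exact ranking to the system's configuration.

```python
# Assumed priority table: lower number = more urgent (safety first, style last).
PRIORITY = {
    "crutch_out_of_sync": 1,   # risk of overloading the operated leg
    "trunk_lean_deg": 2,
    "step_length_m": 3,
}

detected = ["step_length_m", "trunk_lean_deg"]   # deviations found this gait cycle

if detected:
    most_urgent = min(detected, key=lambda name: PRIORITY.get(name, 99))
    print("Give feedback on:", most_urgent)      # everything else is held back for now
```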
6. Learning and Adaptation:
The system stores data from each session and can learn over time. If therapists adjust the training plan based on the robot's feedback, the system can remember these changes and, using machine learning, suggest better training plans in the future or even adapt automatically. It can also improve its error detection and feedback rules as more labeled data (from therapists or patients) becomes available.
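As one made-up illustration of this kind of adaptation: if therapists label recorded sessions as acceptable or not, a tolerance used by the error detector could be re-estimated from the accepted sessions. Everything below (the feature, the labels, and the update rule) is an assumption for the sketch, not the patent's actual learning method.

```python
# Trunk-lean values (degrees) from past sessions, with therapist labels (hypothetical data).
sessions = [
    (6.0, "ok"), (8.5, "ok"), (12.0, "too much lean"),
    (7.2, "ok"), (9.8, "ok"), (14.1, "too much lean"),
]

accepted = sorted(value for value, label in sessions if label == "ok")

# Toy rule: set the new threshold just above the largest therapist-accepted value.
new_threshold = accepted[-1] + 0.5
print(f"updated trunk-lean threshold: {new_threshold:.1f} deg")
```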
7. Flexible and Secure Data Handling:
The system supports anonymous data handling, so personal details are kept private. It can use wireless connections or local storage (like RFID tags or USB sticks) to transfer training plans and results between the robot, the cloud, and therapists’ terminals. Video data can be anonymized by blurring faces, keeping privacy intact.
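Face blurring of recorded video is a well-established technique. As one possible realization (not necessarily the one used in the patent), OpenCV's bundled Haar-cascade face detector can be combined with a Gaussian blur over each detected face region; the file names here are placeholders.

```python
import cv2

def anonymize_faces(frame_bgr):
    """Blur every detected face in a BGR frame before the video is stored or transmitted."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        frame_bgr[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame_bgr[y:y + h, x:x + w], (51, 51), 0)
    return frame_bgr

# Usage (assumes a recording named "session.mp4" exists): anonymize and save one frame.
ok, frame = cv2.VideoCapture("session.mp4").read()
if ok:
    cv2.imwrite("session_anonymized_frame.png", anonymize_faces(frame))
```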
System Architecture:
The invention describes a layered system design:
– The hardware layer includes the robot’s physical components: cameras, sensors, wheels, energy source, display, speakers, microphones, and communication modules (like WiFi or RFID).
– The software layers manage the robot’s skills (like navigation and person tracking), behaviors (like guiding or correcting a patient), and applications (such as gait training plans and progress evaluation); a toy sketch of this layering follows this list.
– The system is connected to a cloud, where training plans, patient data, and learning algorithms are managed and updated. This allows therapists to access, review, and adjust patient plans from anywhere.
– The robot can guide patients to and from training areas, lead them through exercises, and stay at an optimal distance for best sensor coverage.
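To visualize the layering described above, here is a compact, purely illustrative sketch of how skills, behaviors, and applications might stack on top of one another. The class names, methods, and the 3.0 m guidance rule are invented for the example and are not taken from the patent.

```python
class PersonTrackingSkill:
    """Skill layer: wraps a sensor-level capability (reduced here to a stub)."""
    def locate_person(self):
        return {"distance_m": 2.1, "angle_deg": -5.0}

class GuideToTrainingAreaBehavior:
    """Behavior layer: combines skills into a patient-facing action."""
    def __init__(self, tracking: PersonTrackingSkill):
        self.tracking = tracking

    def step(self):
        pose = self.tracking.locate_person()
        # Keep the robot at a sensor-friendly distance while guiding (toy rule).
        return "slow down" if pose["distance_m"] > 3.0 else "keep pace"

class GaitTrainingApplication:
    """Application layer: runs a training plan and reports results to the cloud/therapist."""
    def __init__(self, behavior: GuideToTrainingAreaBehavior):
        self.behavior = behavior

    def run_session(self):
        return {"guidance": self.behavior.step(), "exercises_completed": 1}

app = GaitTrainingApplication(GuideToTrainingAreaBehavior(PersonTrackingSkill()))
print(app.run_session())
```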
Practical Use Cases:
– Gait Training: After hip or knee surgery, patients need to learn to walk correctly, sometimes with crutches. The system can guide them through three-point and two-point gait patterns, ensuring they use aids correctly and progress at a safe pace.
– Therapist Support: The robot frees up therapists’ time by handling routine training and feedback, while still allowing for human review and plan adjustment.
– Standardized Assessment: By comparing patient movements to set standards, the system helps make rehabilitation more consistent and fair.
– Patient Monitoring and Safety: The robot can detect if a patient is at risk of falling, moving unsafely, or not following the plan, and can alert staff as needed.
– Data-Driven Improvement: By collecting and analyzing movement data over time, the system can suggest improved training plans or highlight patients who need extra help.
Why This Matters:
This invention creates a bridge between high-tech motion analysis (once only available in special labs) and everyday patient care. It makes detailed, real-time movement coaching available in busy clinics, hospitals, and rehab centers. It helps patients get better, faster, and with less risk of complications. It helps healthcare providers keep accurate records and standardize care. And as the system learns from more patients and therapists, it gets even better at helping people move well.
Conclusion
The patent application we dissected describes a powerful new tool for rehabilitation and patient care. By combining non-contact sensor technology, smart software, and real-time feedback, it brings expert-level movement analysis to everyday settings. Patients get more chances to practice and improve, therapists can focus on the most complex cases, and hospitals can deliver better care with fewer resources.
This system stands out because it does much more than just watch; it understands, guides, and learns. It recognizes walking aids, adapts to each patient’s needs, and keeps getting smarter over time. For anyone working in healthcare, rehabilitation, or assistive robotics, this invention offers a glimpse of a future where every patient gets the right guidance, right when they need it—and every step toward recovery is tracked, supported, and improved.
To read the full application, visit USPTO Patent Public Search at https://ppubs.uspto.gov/pubwebapp/ and search for publication number 20250218221.