
Harnessing video technology and paper microfluidics to measure protein binding affinities


Measuring the binding affinity between proteins, the building blocks of life, lies at the heart of biochemistry, disease diagnostics, and drug discovery. However, despite widespread demand and potential applications, common methods rely on expensive laboratory-based equipment, such as surface plasmon resonance and interferometry.

New research from Professor McKendry’s team at UCL suggests an approach that could be up to 500-fold cheaper to perform, using only a digital camera, paper microfluidics, and computer vision algorithms to quantify protein binding constants and kinetic rates of reaction.

“We are marrying together consumer electronic video imaging with one of humankind’s oldest materials – paper – to create a powerful ultra low cost platform technology for measuring biochemical binding constants,” says Professor of Biomedical Nanotechnology at UCL and Director of i-sense, Rachel McKendry.

Protein binding is a key factor in determining how well a diagnostic test or a drug performs.

“The result of our research is a platform technology suitable for low-cost biophysical analysis in resource-limited settings,” says first author and PhD student from the i-sense McKendry group at UCL, Ben Miller.

“The work has the potential to widen access to high-throughput protein binding measurements with wide-ranging applications, such as drug discovery, where low cost and low reagent quantities (64-fold lower than interferometry) are important.”

What does the test involve?

The approach is a simple set-up consisting of a digital camera or smartphone, a series of microfluidic paper analytical devices (µPADs) that work similarly to a pregnancy test, and a 96-well plate.

Antigens are immobilised in a line on nitrocellulose paper strips. When antibody-functionalised gold nanoparticles (Ab-AuNPs) flow along the strip, they bind to the antigen at the test line (Scheme a), creating a red line. The more Ab-AuNPs bound, the more intense the red line, allowing binding to be tracked over time.

The strips are dipped into a 96-well plate, where each well contains a different concentration of Ab-AuNP solution (Scheme b). The experiment is then videoed using a digital camera (Scheme c) or smartphone.

Video analysis (in Wolfram Mathematica) quantifies the development of the test line over time (Scheme d): the pixel intensity at the test line is extracted from each frame, and the resulting intensity-versus-time curves are fitted to the Langmuir binding model.
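To illustrate the fitting step, here is a minimal sketch of how a test-line intensity trace can be fitted to pseudo-first-order Langmuir kinetics. The paper used Wolfram Mathematica; this example uses Python with SciPy, and the data, parameter names, and values are hypothetical, included only to show the shape of the analysis.

```python
# Sketch: fit a Langmuir kinetic model I(t) = I_max * (1 - exp(-k_obs * t))
# to a (synthetic) test-line intensity trace. All numbers are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def langmuir_kinetics(t, I_max, k_obs):
    """Pseudo-first-order Langmuir binding: intensity rises to a plateau I_max."""
    return I_max * (1.0 - np.exp(-k_obs * t))

# Synthetic intensity trace (arbitrary units) standing in for per-frame
# pixel values extracted from the video.
t = np.linspace(0, 600, 61)                      # time, seconds
rng = np.random.default_rng(0)
I = langmuir_kinetics(t, 120.0, 0.01) + rng.normal(0.0, 2.0, t.size)

popt, pcov = curve_fit(langmuir_kinetics, t, I, p0=[100.0, 0.005])
I_max_fit, k_obs_fit = popt
print(f"I_max = {I_max_fit:.1f}, k_obs = {k_obs_fit:.4f} per second")

# For a 1:1 binding model, repeating this fit at each Ab-AuNP concentration C
# (one well of the 96-well plate per concentration) gives a set of k_obs values;
# k_obs = k_on * C + k_off, so a linear fit of k_obs against C yields the
# kinetic rates k_on and k_off, and hence the dissociation constant K_D = k_off / k_on.
```

Running the fit across the concentration series is what turns the paper strips from a yes/no readout into a fully quantitative kinetic measurement.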

A new generation of smartphone-based biosensors

"This new methodology not only represents a novel tool to characterise protein binding interactions important for basic science studies and sensor development, but also transforms paper-based tests into a fully quantitative technology, which takes advantage of real-time kinetic measurements,” says Dr Claudio Parolo who was a Postdoctoral Research Associate in the McKendry group at UCL and is now at the University of California Santa Barbara.

The approach is part of a new generation of smartphone-connected biosensors emerging from recent advances in consumer technology. Such devices are cheaper to manufacture, require less training to use, are more easily transported, and do not need complex lab equipment, making them suitable for resource-limited and remote settings.

“i-sense tools and technologies are at the heart of the emerging field of mobile health and aim to find ways to make testing, and treating infectious diseases faster, easier, and more cost-effective,” says Professor Rachel McKendry.

“In this paper, we have demonstrated the proof of concept and can measure multiple antibody-antigen interactions simultaneously. In future, this could be applied to a wide range of biological and chemical binding interactions with potential applications in disease diagnostics, monitoring, drug discovery, forensics and environmental analysis.”
