
Infoscience

Conference paper

Fast Texture Segmentation Model based on the Shape Operator and Active Contour

Houhou, Nawal • Thiran, Jean-Philippe • Bresson, Xavier
2008
Computer Vision and Pattern Recognition

We present an approach for unsupervised segmentation of natural and textural images based on active contours, differential geometry, and information-theoretic concepts. More precisely, we propose a new texture descriptor which intrinsically defines the geometry of textural regions using the shape operator borrowed from differential geometry. Then, we use the popular Kullback-Leibler distance to define an active contour model which distinguishes the background and the textural objects of interest, each represented by the probability density function of our new texture descriptor. We prove the existence of a solution to the proposed segmentation model. Finally, a fast and easy-to-implement texture segmentation algorithm is introduced to extract meaningful objects. We present promising synthetic and real-world results and compare our algorithm to other state-of-the-art techniques.
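The Kullback-Leibler distance used in the abstract to compare the probability density functions of the texture descriptor can be illustrated with a minimal sketch. This is not the paper's implementation: the symmetrized form, the discrete-histogram representation, and the function name are assumptions made for illustration only.

```python
import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    """Symmetrized Kullback-Leibler divergence between two discrete
    distributions (e.g. histograms of a texture descriptor computed
    inside and outside an evolving contour).

    A small epsilon avoids log(0), and both inputs are renormalized
    so they sum to one before the divergence is computed.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()
    q = q / q.sum()
    # KL(p || q) + KL(q || p): zero iff the distributions coincide,
    # growing as the two regions become easier to tell apart.
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))
```

In an active-contour setting, a measure of this kind is typically maximized (or drives the contour evolution) so that the contour separates regions whose descriptor distributions differ the most.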

Name: PID612190.pdf
Access type: openaccess
Size: 937.45 KB
Format: Adobe PDF
Checksum (MD5): b1e18f91fcb1b87f2408c967863892b4


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.