Publication

Uncalibrated eye-to-hand visual servoing using inverse fuzzy models

dc.contributor.author: Gonçalves, Paulo
dc.contributor.author: Mendonça, L.F.
dc.contributor.author: Sousa, João M.C.
dc.contributor.author: Pinto, J.R. Caldas
dc.date.accessioned: 2017-03-27T16:37:11Z
dc.date.available: 2017-03-27T16:37:11Z
dc.date.issued: 2008
dc.description: (c) 2007 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
dc.description.abstract: A new uncalibrated eye-to-hand visual servoing based on inverse fuzzy modeling is proposed in this paper. In classical visual servoing, the Jacobian plays a decisive role in the convergence of the controller, as its analytical model depends on the selected image features. This Jacobian must also be inverted online. Fuzzy modeling is applied to obtain an inverse model of the mapping between image feature variations and joint velocities. This approach is independent from the robot's kinematic model or camera calibration and also avoids the necessity of inverting the Jacobian online. An inverse model is identified for the robot workspace, using measurement data of a robotic manipulator. This inverse model is directly used as a controller. The inverse fuzzy control scheme is applied to a robotic manipulator performing visual servoing for random positioning in the robot workspace. The obtained experimental results show the effectiveness of the proposed control scheme. The fuzzy controller can position the robotic manipulator at any point in the workspace with better accuracy than the classic visual servoing approach.
dc.description.version: info:eu-repo/semantics/publishedVersion
dc.identifier.citation: GONÇALVES, P.J.S. [et al.] (2008) - Uncalibrated eye-to-hand visual servoing using inverse fuzzy models. IEEE Transactions on Fuzzy Systems. ISSN 1063-6706. 16 (2), 341-353. DOI 10.1109/TFUZZ.2007.896226
dc.identifier.doi: 10.1109/TFUZZ.2007.896226
dc.identifier.issn: 1063-6706
dc.identifier.uri: http://hdl.handle.net/10400.11/5485
dc.language.iso: eng
dc.peerreviewed: yes
dc.publisher: IEEE
dc.relation: Multi-annual Funding Programme for R&D Units (POCTI) under the Community Support Framework III, funded by the FEDER programme
dc.relation: POCTI/EME/39946/2001 - VISUAL SERVOING SYSTEMS APPLIED TO RIGID AND FLEXIBLE ROBOT MANIPULATORS
dc.relation: EU-ESF programme, PRODEP III, action 5.3, under the Community Support Framework III
dc.relation.publisherversion: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=4374115
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Fuzzy control
dc.subject: Fuzzy set theory
dc.subject: Manipulator kinematics
dc.subject: Robot vision
dc.subject: Visual servoing
dc.subject: Fuzzy modeling
dc.subject: Inverse fuzzy control
dc.title: Uncalibrated eye-to-hand visual servoing using inverse fuzzy models
dc.type: journal article
dspace.entity.type: Publication
oaire.citation.endPage: 353
oaire.citation.startPage: 341
oaire.citation.title: IEEE Transactions on Fuzzy Systems
oaire.citation.volume: 16
person.familyName: Gonçalves
person.givenName: Paulo
person.identifier.ciencia-id: 2816-A2FA-C5A3
person.identifier.orcid: 0000-0002-8692-7338
person.identifier.rid: E-5640-2012
person.identifier.scopus-author-id: 35853838000
rcaap.rights: openAccess
rcaap.type: article
relation.isAuthorOfPublication: 86a6a234-d690-4c2b-8bee-d58005eebba2
relation.isAuthorOfPublication.latestForDiscovery: 86a6a234-d690-4c2b-8bee-d58005eebba2
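
The abstract above describes identifying an inverse fuzzy model of the mapping from image-feature variations to joint velocities and using it directly as the controller, so no image Jacobian has to be inverted online. The sketch below illustrates that control idea only in broad strokes; it is not the paper's implementation. It assumes a Takagi-Sugeno fuzzy model with Gaussian antecedents and affine consequents, and all class names, parameters, and numbers are made-up placeholders standing in for a model identified from robot measurement data.

# Minimal sketch (hypothetical, not the authors' code) of an inverse fuzzy
# model used as a visual servoing controller: desired image-feature change
# in, joint velocity command out.
import numpy as np

class InverseFuzzyController:
    """First-order Takagi-Sugeno model: Gaussian memberships on the image
    feature error select rules; each rule's affine consequent proposes joint
    velocities; the output is the normalized weighted blend."""

    def __init__(self, centers, widths, consequents):
        self.centers = centers          # (n_rules, n_features) rule centers
        self.widths = widths            # (n_rules, n_features) Gaussian widths
        self.consequents = consequents  # (n_rules, n_joints, n_features + 1)

    def __call__(self, feature_error):
        # Degree of fulfilment of each rule (product of Gaussian memberships).
        d = (feature_error - self.centers) / self.widths
        w = np.exp(-0.5 * np.sum(d * d, axis=1))
        w = w / (np.sum(w) + 1e-12)
        # Affine consequent of each rule: q_dot_i = A_i @ [error, 1].
        x = np.append(feature_error, 1.0)
        q_dot_rules = self.consequents @ x      # (n_rules, n_joints)
        return w @ q_dot_rules                  # blended joint velocity command

# Toy usage: 2 image features, 2 joints, 3 rules with made-up parameters.
rng = np.random.default_rng(0)
ctrl = InverseFuzzyController(
    centers=rng.normal(size=(3, 2)),
    widths=np.ones((3, 2)),
    consequents=rng.normal(scale=0.1, size=(3, 2, 3)),
)
error = np.array([0.05, -0.02])   # desired minus current image features
print("joint velocity command:", ctrl(error))

In the paper's scheme the consequent and membership parameters come from identification on measured feature/joint data over the workspace; the random numbers here only keep the example self-contained and runnable.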

Files

Original bundle
Name: 2008_tfs.pdf
Size: 85.85 KB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 2.02 KB
Description: Item-specific license agreed upon at submission