Database: MEDLINE
Search: F01.145.209.530.538.445 [DeCS Category]
References found: 1978
Showing: 1 .. 10 in [Detailed] format

page 1 of 198
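The DeCS tree number F01.145.209.530.538.445 in the search above corresponds to the MeSH descriptor "Gestures" (every record below is indexed under it). As an aside, an equivalent search can be issued against PubMed through NCBI's public E-utilities ESearch endpoint. The sketch below only builds the query URL; it makes no network call, and the hit count returned by PubMed may differ from the 1978 shown by this interface.

```python
from urllib.parse import urlencode

# Public NCBI E-utilities ESearch endpoint.
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_search_url(mesh_term: str, retmax: int = 10) -> str:
    """Build an ESearch URL for records indexed under a MeSH descriptor."""
    params = {
        "db": "pubmed",
        "term": f'"{mesh_term}"[MeSH Terms]',
        "retmax": retmax,       # number of PMIDs to return per page
        "retmode": "json",
    }
    return f"{BASE}?{urlencode(params)}"

print(build_search_url("Gestures"))
```

Fetching the resulting URL (e.g. with `urllib.request`) returns a JSON body whose `esearchresult.idlist` holds the matching PMIDs.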

  1 / 1978 MEDLINE  
[PMID]:29390725
[Au] Author:Zhao Z; Luo H; Song GH; Chen Z; Lu ZM; Wu X
[Ad] Address:School of Electronics and Information, Zhejiang University of Media and Communications, Hangzhou 310018, China.
[Ti] Title:Web-based interactive drone control using hand gesture.
[So] Source:Rev Sci Instrum;89(1):014707, 2018 Jan.
[Is] ISSN:1089-7623
[Cp] Country of publication:United States
[La] Language:eng
[Ab] Abstract:This paper develops a drone control prototype based on web technology with the aid of hand gestures. The uplink control commands and downlink data (e.g., video) are transmitted over WiFi, and all information exchange takes place on the web. Control commands are translated from various predetermined hand gestures. Specifically, the hardware of this friendly interactive control system is composed of a quadrotor drone, a computer-vision-based hand gesture sensor, and a cost-effective computer. The software is simplified as a web-based user interface program. Aided by natural hand gestures, this system significantly reduces the complexity of traditional human-computer interaction, making remote drone operation more intuitive. Meanwhile, a web-based automatic control mode is provided in addition to the hand gesture control mode. For both operation modes, no extra application program needs to be installed on the computer. Experimental results demonstrate the effectiveness and efficiency of the proposed system, including control accuracy, operation latency, etc. This system can be used in many applications, such as controlling a drone in a global-positioning-system-denied environment or by handlers without professional drone control knowledge, since it is easy to get started.
[Mh] Primary MeSH terms:Gestures
Telemetry
User-Computer Interface
[Mh] Secondary MeSH terms:Humans
Internet
Software
[Pt] Publication type:JOURNAL ARTICLE
[Em] Entry month:1802
[Cu] Update by class:180208
[Lr] Date of last revision:180208
[Sb] Journal subset:IM
[Da] Date of entry for processing:180203
[St] Status:MEDLINE
[do] DOI:10.1063/1.5004004
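The bracket-tagged layout of the record above ([PMID], [Au], [Mh], ..., with untagged continuation lines carrying extra MeSH terms) can be parsed mechanically. The sketch below is one possible approach, not part of any official toolkit: it keys off the tag code inside the brackets and ignores the human-readable label after it, so it works whether the labels are in Portuguese or English. Repeated tags (the two [Mh] lines) are merged into a single list.

```python
import re

def parse_record(text: str) -> dict:
    """Parse one iAH-style tagged MEDLINE record into {tag: [values]}.

    A line matching "[Tag] label:value" opens a new field; any other
    non-blank line is treated as a continuation of the previous field
    (e.g. the additional MeSH terms listed under an [Mh] line).
    """
    fields, tag = {}, None
    for line in text.splitlines():
        m = re.match(r"\[(\w+)\]\s*[^:]*:(.*)", line)
        if m:
            tag, value = m.group(1), m.group(2).strip()
            fields.setdefault(tag, []).append(value)
        elif tag and line.strip():
            fields[tag].append(line.strip())
    return fields

# A shortened record in the format shown above.
record = """[PMID]:29390725
[Au] Author:Zhao Z; Luo H
[Ti] Title:Web-based interactive drone control using hand gesture.
[Mh] Primary MeSH terms:Gestures
Telemetry"""

print(parse_record(record)["Mh"])  # → ['Gestures', 'Telemetry']
```

A real export would also need the per-record navigation lines filtered out before parsing, since they carry no `[Tag]` marker and would otherwise be misread as continuations.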


  2 / 1978 MEDLINE  
[PMID]:27771538
[Au] Author:Carrigan EM; Coppola M
[Ad] Address:University of Connecticut, United States. Electronic address: emily.carrigan@uconn.edu.
[Ti] Title:Successful communication does not drive language development: Evidence from adult homesign.
[So] Source:Cognition;158:10-27, 2017 01.
[Is] ISSN:1873-7838
[Cp] Country of publication:Netherlands
[La] Language:eng
[Ab] Abstract:Constructivist accounts of language acquisition maintain that the language learner aims to match a target provided by mature users. Communicative problem solving in the context of social interaction and matching a linguistic target or model are presented as primary mechanisms driving the language development process. However, research on the development of homesign gesture systems by deaf individuals who have no access to a linguistic model suggests that aspects of language can develop even when typical input is unavailable. In four studies, we examined the role of communication in the genesis of homesign systems by assessing how well homesigners' family members comprehend homesign productions. In Study 1, homesigners' mothers showed poorer comprehension of homesign descriptions produced by their now-adult deaf child than of spoken Spanish descriptions of the same events produced by one of their adult hearing children. Study 2 found that the younger a family member was when they first interacted with their deaf relative, the better they understood the homesigner. Despite this, no family member comprehended homesign productions at levels that would be expected if family members co-generated homesign systems with their deaf relative via communicative interactions. Study 3 found that mothers' poor or incomplete comprehension of homesign was not a result of incomplete homesign descriptions. In Study 4 we demonstrated that Deaf native users of American Sign Language, who had no previous experience with the homesigners or their homesign systems, nevertheless comprehended homesign productions out of context better than the homesigners' mothers. This suggests that homesign has comprehensible structure, to which mothers and other family members are not fully sensitive. Taken together, these studies show that communicative problem solving is not responsible for the development of structure in homesign systems. The role of this mechanism must therefore be re-evaluated in constructivist theories of language development.
[Mh] Primary MeSH terms:Comprehension
Deafness/psychology
Language Development
Linguistics
Sign Language
[Mh] Secondary MeSH terms:Adolescent
Adult
Female
Gestures
Humans
Male
Middle Aged
Mothers/psychology
Siblings/psychology
Young Adult
[Pt] Publication type:JOURNAL ARTICLE; RESEARCH SUPPORT, N.I.H., EXTRAMURAL; RESEARCH SUPPORT, U.S. GOV'T, NON-P.H.S.; RESEARCH SUPPORT, NON-U.S. GOV'T
[Em] Entry month:1709
[Cu] Update by class:180207
[Lr] Date of last revision:180207
[Sb] Journal subset:IM
[Da] Date of entry for processing:161025
[St] Status:MEDLINE


  3 / 1978 MEDLINE  
[PMID]:27776421
[Au] Author:Schäfer MC; Sutherland D; McLay L; Achmadi D; van der Meer L; Sigafoos J; Lancioni GE; O'Reilly MF; Schlosser RW; Marschik PB
[Ad] Address:New Zealand Institute of Language, Brain and Behaviour and School of Health Sciences, University of Canterbury, Christchurch, New Zealand.
[Ti] Title:Research note: attitudes of teachers and undergraduate students regarding three augmentative and alternative communication modalities.
[So] Source:Augment Altern Commun;32(4):312-319, 2016 Dec.
[Is] ISSN:1477-3848
[Cp] Country of publication:England
[La] Language:eng
[Ab] Abstract:The social validity of different communication modalities is a potentially important variable to consider when designing augmentative and alternative communication (AAC) interventions. To assess the social validity of three AAC modes (i.e., manual signing, picture exchange, and an iPad-based speech-generating device), we asked 59 undergraduate students (pre-service teachers) and 43 teachers to watch a video explaining each mode. They were then asked to nominate the mode they perceived to be easiest to learn as well as the most intelligible, effective, and preferred. Participants were also asked to list the main reasons for their nominations and report on their experience with each modality. Most participants (68-86%) nominated the iPad-based speech-generating device (SGD) as easiest to learn, as well as the most intelligible, effective, and preferred. This device was perceived to be easy to understand and use and to have familiar and socially acceptable technology. Results suggest that iPad-based SGDs were perceived as more socially valid among this sample of teachers and undergraduate students. Information of this type may have some relevance to designing AAC supports for people who use AAC and their current and future potential communication partners.
[Mh] Primary MeSH terms:Attitude to Health
Communication Aids for Disabled
Communication Disorders/rehabilitation
School Teachers
Students
Teacher Training
[Mh] Secondary MeSH terms:Computers, Handheld
Gestures
Humans
Interpersonal Relations
Qualitative Research
Surveys and Questionnaires
Universities
[Pt] Publication type:JOURNAL ARTICLE
[Em] Entry month:1801
[Cu] Update by class:180117
[Lr] Date of last revision:180117
[Sb] Journal subset:IM
[Da] Date of entry for processing:161026
[St] Status:MEDLINE


  4 / 1978 MEDLINE  
[PMID]:29243451
[Au] Author:Paananen J
[Ti] Title:Gestures facilitate interaction in multicultural primary care consultations.
[So] Source:Duodecim;133(7):653-9, 2017.
[Is] ISSN:0012-7183
[Cp] Country of publication:Finland
[La] Language:eng
[Ab] Abstract:Using gestures simultaneously with speech prevents and solves problems of understanding in consultations where the doctor and the patient do not share the same cultural and linguistic background. Gestures illustrate what is being said, and highlight the essential information. As gestures are used globally, and can also be interpreted despite limited vocabulary, they make the speech easier to follow. In primary care consultations, the topics are easily expressed by gestures, as they are often related to physical matters such as body parts, symptoms, examinations and treatment. Furthermore, gestural mimicry conveys empathy and willingness to collaborate.
[Mh] Primary MeSH terms:Cultural Competency
Gestures
Physician-Patient Relations
Primary Health Care
[Mh] Secondary MeSH terms:Humans
[Pt] Publication type:JOURNAL ARTICLE; REVIEW
[Em] Entry month:1801
[Cu] Update by class:180115
[Lr] Date of last revision:180115
[Sb] Journal subset:IM
[Da] Date of entry for processing:171216
[St] Status:MEDLINE


  5 / 1978 MEDLINE  
[PMID]:28750071
[Au] Author:Lemaitre G; Scurto H; Françoise J; Bevilacqua F; Houix O; Susini P
[Ad] Address:Equipe Perception et Design Sonores, STMS-IRCAM-CNRS-UPMC, Institut de Recherche et de Coordination Acoustique Musique, Paris, France.
[Ti] Title:Rising tones and rustling noises: Metaphors in gestural depictions of sounds.
[So] Source:PLoS One;12(7):e0181786, 2017.
[Is] ISSN:1932-6203
[Cp] Country of publication:United States
[La] Language:eng
[Ab] Abstract:Communicating an auditory experience with words is a difficult task and, in consequence, people often rely on imitative non-verbal vocalizations and gestures. This work explored the combination of such vocalizations and gestures to communicate auditory sensations and representations elicited by non-vocal everyday sounds. Whereas our previous studies have analyzed vocal imitations, the present research focused on gestural depictions of sounds. To this end, two studies investigated the combination of gestures and non-verbal vocalizations. A first, observational study examined a set of vocal and gestural imitations of recordings of sounds representative of a typical everyday environment (ecological sounds) with manual annotations. A second, experimental study used non-ecological sounds whose parameters had been specifically designed to elicit the behaviors highlighted in the observational study, and used quantitative measures and inferential statistics. The results showed that these depicting gestures are based on systematic analogies between a referent sound, as interpreted by a receiver, and the visual aspects of the gestures: auditory-visual metaphors. The results also suggested a different role for vocalizations and gestures. Whereas the vocalizations reproduce all features of the referent sounds as faithfully as vocally possible, the gestures focus on one salient feature with metaphors based on auditory-visual correspondences. Both studies highlighted two metaphors consistently shared across participants: the spatial metaphor of pitch (mapping different pitches to different positions on the vertical dimension), and the rustling metaphor of random fluctuations (rapid shaking of the hands and fingers). We interpret these metaphors as the result of two kinds of representations elicited by sounds: auditory sensations (pitch and loudness) mapped to spatial position, and causal representations of the sound sources (e.g., rain drops, rustling leaves) pantomimed and embodied by the participants' gestures.
[Mh] Primary MeSH terms:Gestures
Metaphor
Sound
[Mh] Secondary MeSH terms:Adolescent
Adult
Female
Humans
Male
Middle Aged
Loudness Perception
Sound Spectrography
Young Adult
[Pt] Publication type:JOURNAL ARTICLE; OBSERVATIONAL STUDY
[Em] Entry month:1710
[Cu] Update by class:171009
[Lr] Date of last revision:171009
[Sb] Journal subset:IM
[Da] Date of entry for processing:170728
[St] Status:MEDLINE
[do] DOI:10.1371/journal.pone.0181786


  6 / 1978 MEDLINE  
[PMID]:28715475
[Au] Author:Bhattacharjee D; N ND; Gupta S; Sau S; Sarkar R; Biswas A; Banerjee A; Babu D; Mehta D; Bhadra A
[Ad] Address:Department of Biological Sciences, Indian Institute of Science Education and Research Kolkata, Nadia, West Bengal, India.
[Ti] Title:Free-ranging dogs show age related plasticity in their ability to follow human pointing.
[So] Source:PLoS One;12(7):e0180643, 2017.
[Is] ISSN:1932-6203
[Cp] Country of publication:United States
[La] Language:eng
[Ab] Abstract:Differences in pet dogs' and captive wolves' ability to follow human communicative intents have led to the proposition of several hypotheses regarding the possession and development of social cognitive skills in dogs. It is possible that the social cognitive abilities of pet dogs are induced by indirect conditioning through living with humans, and studying free-ranging dogs can provide deeper insights into differentiating between innate abilities and conditioning in dogs. Free-ranging dogs are mostly scavengers, indirectly depending on humans for their sustenance. Humans can act both as food providers and as threats to these dogs, and thus understanding human gestures can be a survival need for the free-ranging dogs. We tested the responsiveness of such dogs in urban areas toward simple human pointing cues using dynamic proximal points. Our experiment showed that pups readily follow proximal pointing and exhibit weaker avoidance to humans, but stop doing so at the later stages of development. While juveniles showed frequent and prolonged gaze alternations, only adults adjusted their behaviour based on the reliability of the human experimenter after being rewarded. Thus free-ranging dogs show a tendency to respond to human pointing gestures, with a certain level of behavioural plasticity that allows learning from ontogenic experience.
[Mh] Primary MeSH terms:Aging/physiology
Cognition/physiology
Conditioning (Psychology)
Gestures
[Mh] Secondary MeSH terms:Animals
Cues
Dogs
Humans
Wolves
[Pt] Publication type:JOURNAL ARTICLE
[Em] Entry month:1709
[Cu] Update by class:170926
[Lr] Date of last revision:170926
[Sb] Journal subset:IM
[Da] Date of entry for processing:170718
[St] Status:MEDLINE
[do] DOI:10.1371/journal.pone.0180643


  7 / 1978 MEDLINE  
[PMID]:28575084
[Au] Author:Itaguchi Y; Yamada C; Yoshihara M; Fukuzawa K
[Ad] Address:Department of System Design Engineering, Keio University, Yokohama, Japan.
[Ti] Title:Writing in the air: A visualization tool for written languages.
[So] Source:PLoS One;12(6):e0178735, 2017.
[Is] ISSN:1932-6203
[Cp] Country of publication:United States
[La] Language:eng
[Ab] Abstract:The present study investigated interactions between cognitive processes and finger actions called "kusho," meaning "air-writing" in Japanese. Kanji-culture individuals often employ kusho behavior, in which they move their fingers as a substitute for a pen, mostly when they are trying to recall the shape of a Kanji character or the spelling of an English word. To further examine the visualization role of kusho behavior on cognitive processing, we conducted a Kanji construction task in which a stimulus (i.e., sub-parts to be constructed) was simultaneously presented. In addition, we conducted a Kanji vocabulary test to reveal the relation between the kusho benefit and vocabulary size. The experiment provided two sets of novel findings. First, executing kusho behavior improved task performance (correct responses) as long as the participants watched their finger movements while solving the task. This result supports the idea that visual feedback of kusho behavior helps cognitive processing for the task. Second, task performance was positively correlated with the vocabulary score when stimuli were presented for a relatively long time, whereas the kusho benefits and vocabulary score were not correlated regardless of stimulus-presentation time. These results imply that a longer stimulus-presentation could allow participants to utilize their lexical resources for solving the task. The current findings together support the visualization role of kusho behavior, adding experimental evidence supporting the view that there are interactions between cognition and motor behavior.
[Mh] Primary MeSH terms:Fingers
Gestures
Handwriting
Imagination/physiology
Language
Psychomotor Performance/physiology
Spatial Processing/physiology
[Mh] Secondary MeSH terms:Culture
Female
Humans
Language Tests
Male
Models, Psychological
Reading
Young Adult
[Pt] Publication type:JOURNAL ARTICLE
[Em] Entry month:1709
[Cu] Update by class:170911
[Lr] Date of last revision:170911
[Sb] Journal subset:IM
[Da] Date of entry for processing:170603
[St] Status:MEDLINE
[do] DOI:10.1371/journal.pone.0178735


  8 / 1978 MEDLINE  
[PMID]:28571889
[Au] Author:Cadime I; Silva C; Santos S; Ribeiro I; Viana FL
[Ad] Address:Research Centre on Child Studies, University of Minho, Portugal. Electronic address: irenecadime@ie.uminho.pt.
[Ti] Title:The interrelatedness between infants' communicative gestures and lexicon size: A longitudinal study.
[So] Source:Infant Behav Dev;48(Pt B):88-97, 2017 08.
[Is] ISSN:1934-8800
[Cp] Country of publication:United States
[La] Language:eng
[Ab] Abstract:Research has shown a close relationship between gestures and language development. In this study, we investigate the cross-lagged relationships between different types of gestures and two lexicon dimensions: number of words produced and comprehended. Information about gestures and lexical development was collected from 48 typically developing infants when they were aged 0;9, 1;0 and 1;3. The European Portuguese version of the MacArthur-Bates Communicative Development Inventory: Words and Gestures (PT CDI:WG) was used. The results indicated that the total number of actions and gestures and the number of early gestures produced at 0;9 and at 1;0 year predicted the number of words comprehended three months later. Actions and gestures' predictive power of the number of words produced was limited to the 0;9-1;0 year interval. The opposite relationship was not found: word comprehension and production did not predict actions and gestures three months later. These results highlight the importance of non-verbal communicative behavior in language development.
[Mh] Primary MeSH terms:Child Development
Gestures
Language Development
[Mh] Secondary MeSH terms:Comprehension
Female
Humans
Infant
Language Development Disorders
Longitudinal Studies
Male
Surveys and Questionnaires
Vocabulary
[Pt] Publication type:JOURNAL ARTICLE; RESEARCH SUPPORT, NON-U.S. GOV'T
[Em] Entry month:1710
[Cu] Update by class:171126
[Lr] Date of last revision:171126
[Sb] Journal subset:IM
[Da] Date of entry for processing:170603
[St] Status:MEDLINE


  9 / 1978 MEDLINE  
[PMID]:28535366
[Au] Author:Hogrefe K; Ziegler W; Weidinger N; Goldenberg G
[Ad] Address:Clinical Neuropsychology Research Group (EKN), Institute of Phonetics and Speech Processing, Ludwig-Maximilians-Universität München, Munich, Germany. Electronic address: K.Hogrefe@ekn-muenchen.de.
[Ti] Title:Comprehensibility and neural substrate of communicative gestures in severe aphasia.
[So] Source:Brain Lang;171:62-71, 2017 Aug.
[Is] ISSN:1090-2155
[Cp] Country of publication:Netherlands
[La] Language:eng
[Ab] Abstract:Communicative gestures can compensate for the incomprehensibility of oral speech in severe aphasia, but the brain damage that causes aphasia may also have an impact on the production of gestures. We compared the comprehensibility of gestural communication of persons with severe aphasia and non-aphasic persons and used voxel-based lesion symptom mapping (VLSM) to determine lesion sites that are responsible for poor gestural expression in aphasia. On the group level, persons with aphasia conveyed more information via gestures than controls, indicating a compensatory use of gestures in persons with severe aphasia. However, individual analysis showed a broad range of gestural comprehensibility. VLSM suggested that poor gestural expression was associated with lesions in anterior temporal and inferior frontal regions. We hypothesize that likely functional correlates of these localizations are selection of and flexible changes between communication channels as well as between different types of gestures and between features of actions and objects that are expressed by gestures.
[Mh] Primary MeSH terms:Aphasia/pathology
Aphasia/physiopathology
Communication
Comprehension/physiology
Gestures
[Mh] Secondary MeSH terms:Adult
Aged
Aphasia/complications
Brain Injuries/complications
Brain Injuries/pathology
Brain Injuries/physiopathology
Female
Frontal Lobe/pathology
Frontal Lobe/physiopathology
Humans
Male
Middle Aged
Temporal Lobe/pathology
Temporal Lobe/physiopathology
[Pt] Publication type:JOURNAL ARTICLE
[Em] Entry month:1710
[Cu] Update by class:171030
[Lr] Date of last revision:171030
[Sb] Journal subset:IM
[Da] Date of entry for processing:170524
[St] Status:MEDLINE


  10 / 1978 MEDLINE  
[PMID]:28448161
[Au] Author:Cartwright E; Clegg AL
[Ad] Address:Department of Anthropology, Idaho State University, Pocatello, Idaho, USA.
[Ti] Title:Peaches for Lunch: Creating and Using Visual Variables.
[So] Source:Med Anthropol;36(6):519-532, 2017 Aug-Sep.
[Is] ISSN:1545-5882
[Cp] Country of publication:United States
[La] Language:eng
[Ab] Abstract:In this article, I describe the process of systematically including nonverbal data in medical anthropology research. I demonstrate the process of visualizing and coding videotaped moments of life and show how we can analyze what is being done along with what is being said. I ground my discussion in toddler language socialization and then expand my observations to the realm of language pathologies. Aphasia from strokes, speech difficulties in neurologically based illnesses like Lou Gehrig's disease, and the variety of communication challenges that face those on the autism spectrum can all be studied in interesting ways by including precise descriptions of nonverbal actions. I discuss the process of recording and coding the data with the software Observer XT 11.5 by Noldus. This method of collecting and analyzing video data can be used for many anthropological questions, in addition to those concerned with communication.
[Mh] Primary MeSH terms:Anthropology, Medical/methods
Communication
Socialization
Video Recording
[Mh] Secondary MeSH terms:Adult
Aphasia
Autistic Disorder
Female
Fixation, Ocular
Gestures
Humans
Infant
Lunch
Male
[Pt] Publication type:JOURNAL ARTICLE
[Em] Entry month:1709
[Cu] Update by class:170915
[Lr] Date of last revision:170915
[Sb] Journal subset:IM
[Da] Date of entry for processing:170428
[St] Status:MEDLINE
[do] DOI:10.1080/01459740.2017.1321643






Search engine: iAH v2.6 powered by WWWISIS

BIREME/PAHO/WHO - Latin American and Caribbean Center on Health Sciences Information