Decision Support Systems 43 (2007) 607 – 617 www.elsevier.com/locate/dss

Trust and technologies: Implications for organizational work practices

Melanie J. Ashleigh a,*, Joe Nandhakumar b

a School of Management, University of Southampton, Southampton, SO17 1BJ, England, UK
b School of Management, University of Bath, Bath, BA2 7AY, England, UK

Available online 14 July 2005

Abstract

In this paper, we empirically investigate the concept of trust across organizational work practices by examining three groups: within the team, between teams and when interacting with technology. The study adopts Repertory Grid methodology as an interview-based technique to elicit constructs of trust important to engineering teams working in two organizations within the energy distribution industry. Thirteen key constructs of trust were identified using content analysis. Drawing on the understanding gained, this paper discusses the implications for theories on trust within teams working with technology across organizations and provides a grounded perspective that could be used as a basis for further research.
© 2005 Elsevier B.V. All rights reserved.

Keywords: Trust constructs; Technology interaction; Teams; Repertory grid methodology; Process control

1. Introduction

It is widely acknowledged that the presence of "trust" is one of the main conditions for effective cooperation among individuals, groups and organizations [18,33,25,9]. Recent advances in information technologies have helped organizations to apply such technologies in innovative ways for supporting collaborative work practices [34]. Such work practices represent a complex blend of human actors and technological systems, in which individuals can accomplish tasks and interactions through technological systems that they could not otherwise achieve.

It is not clear how organizational members in such complex settings conceptualise trust, and with what or whom they can meaningfully speak of building trust relationships. In an attempt to increase inter-organizational alliances and competitive advantage, advances in technology and innovation in virtual partnerships have intensified the need for collaboration across inter-organizational working practices [32]. Consequently, trust has become a basic ingredient for inter-organizational success [11,16]. This study aims to gain a richer understanding of how organizational members in volatile organizational settings conceptualise trust.


Repertory Grid methodology [17] was used as an interview-based technique to elicit constructs of trust in two organizations. Thirteen core constructs were elicited. Differences were found both within and between the three groups, according to participants' scored levels of trust for each group. The understanding gained is used to outline implications for theories on trust within and between teams working with technological systems across organizations.

2. The concept of trust in organizations

From an inter-organizational perspective, the concept of trust underlies the sharing and transfer of vital knowledge and is particularly significant in the information systems environment [30]. While new technological systems are geared towards supporting workers in improving task performance and social networks, research is now concerned with why workers are hindered by such systems and why they are not motivated to use them [12]. It has been reported that such systems not only increase workload, but also that reticence in using them stems from a general lack of trust in both the systems and the people who design them [35]. Although the primary focus of this research is trust at the team level and between humans and technology, we argue that it is also applicable in gaining a greater understanding of trust within technological environments such as inter-organizational partnerships and/or virtual teams, and therefore relates to current business processes.

Generally, the concept of trust in organisations has gained increasing attention from management researchers (e.g. [22,18,33]) and computing human factors researchers (e.g. [8,31]). Trust is often seen by researchers as the most difficult concept to handle in empirical research because of the diverse definitions used in each discipline and the multitude of functions it performs in society [26]. Rousseau et al. [33] therefore claim that there is "no universally accepted scholarly definition of trust". Many researchers perceive trust in terms of individuals' expression of confidence in others' intentions and motives (e.g. [5]). This viewpoint attributes trust to an interpersonal relationship, as Friedman et al. [8] claim: "People trust people, not technology" (p. 36).

More recently, however, researchers perceive trust in terms of optimistic expectations of the behaviour of another (e.g. [22]). Lewis and Weigert [20] distinguish three dimensions of trust: cognitive, emotional and behavioural. Here trust is seen not as an individual attribute but as a collective one. The emotional process occurs when an affective state exists: an emotional bond forms in the relationship between individuals or groups of people, all of which is underwritten by behaviour. Lewis and Weigert [20], however, do not offer any explanation of the social mechanisms that can help develop or impede trust. This theoretical model is nevertheless useful in trying to understand how trust is perceived within and across organisations, and it was used as the theoretical underpinning of Cummings and Bromiley's [3] Organisational Trust Inventory, which employs the same three dimensions in developing a matrix against three elements of trust as a belief system. By believing in a shared common goal, they maintain that group action should be based on: good-faith efforts to behave in accordance with implicit and explicit commitments, honesty in negotiations preceding those commitments, and not taking advantage of another person even when the opportunity presents itself. The study reported in this paper adopts this general model as a basis for the exploration of trust in organizational work practices involving teams and technological systems.

Trust within and between teams is a much more complex phenomenon, as teams involve multiple, interdependent actors. It is precisely this interdependency, however, that necessitates some element of trust being present for a team to function effectively [15]. Growing research devoted to teamwork refers to factors such as cohesiveness, cooperation, coordination and effective communication processes as some of the most important issues in achieving team effectiveness (e.g. [23]). These interdependent factors have been found to affect team decision making enormously, both directly and indirectly. While information technology is increasingly used to mediate teamworking, Stanton and Ashleigh [36] argue that team members are often reluctant to trust technology until they have gained experience from using a system and have had positive, meaningful feedback from it. Technology is not value neutral but provides a form of 'fittingness' and reliability that follows from features of the technology [8].


In a study researching user acceptance of information technology, Davis [4] found that the perceived usefulness of a system (i.e. does it perform the task) was 50% more influential than its ease of use in determining how much the system was actually used. This research emphasises the importance of designing new systems with appropriate functional capabilities to suit user expectations, and of considering how operators will adjust to the different functions of future technology. Muir [28] found that operators' trust changed only with the performance of the machine, and that people quickly reverted to manual operation when they felt the technology was unreliable. When controlling any dynamic system (e.g. the energy distribution systems in this paper), team members are constantly reliant on technology as well as having to multi-task over a period of time. This necessitates greater interdependency both within and between teams, as well as between the technology and its human interface. Some would argue against the possibility of any kind of trust being developed across virtual teams or organizations, maintaining that trust needs 'touch' [9]. Others have debated how technological environments can impair a team's functional performance, hence inhibiting the development of trust [21]. The implication is that, in time-dependent, project-oriented relationships, where there is a lack of social interaction, a mutually shared history or an embedded culture, this added diversity impedes the development of trust. Conversely, other authors have reported that, even without social cues, a 'swift', 'temporary' or 'abstract' trust is possible over virtual networks [14,24,29]. However, it is acknowledged that this trust is fragile and easily diminishes unless supported by ongoing effective communication [13]. Today, in inter-organizational relationships where virtuality is becoming the norm and the majority of communication is computer mediated, it is therefore essential to develop a fuller understanding of the key elements of trust between humans and their technology, as well as within and between teams.

3. Research sites

Two companies within the energy distribution industry were used as research sites.


The study specifically focussed on 16 male control engineers, eight interviewed from each company (2 × 8), examining trust across three groups: within the team, between teams and when interacting with technology. All participants were either chemical or electrical engineers and had a minimum of 3 years' experience working as control room engineers. In this context, engineers continuously control a physically remote plant via bespoke systems to maintain a common goal of balancing the generation of energy with demand. Working practices also included monitoring oscillating variables such as changes in flow and pressure, alarm handling, energy storage and organising the maintenance of the physical plant. Consequently, there were many layers of sub-tasks within the main function of maintaining a stable system. Engineers were therefore constantly interacting with their systems and each other, and the whole socio-technical system was interdependent. Company restructuring had also affected working practices, as some intra-team members were now forced to work remotely from each other; this greatly reduced face-to-face interaction and could subsequently influence the development of trust.

4. Research approach

The Repertory Grid methodology [17] was used as a field research technique to elicit constructs of trust important to team members working in energy distribution control rooms. The method is based on Personal Construct Psychology and was introduced by Kelly to explain how people conceptualise their world. It rests on the notion that humans actively generate and test their own hypotheses by constructing a personal system of constructs; they continually form and revise these constructs in order to understand and test these hypotheses in relation to their reality. The methodology adopts a phenomenological, bottom-up approach and was therefore considered an appropriate and novel way to understand how control engineers constructed the concept of trust within their own work domain. It was also considered appropriate for this contextual domain, as it not only produces rich, meaningful information but also allows the use of a standardised scoring system. This minimises interviewer interpretation, as participants have to score the applicability of each description themselves [37].


Participants volunteered to give their opinion on the concept of trust within their work context. Three sets of different elements were used to indicate three different groups: within the team, between teams and when interacting with technology. Triads of elements, of either people or systems, were developed in collaboration with the participants. These were taken from each group, and participants were asked what important construct or characteristic made two of the elements similar to each other but different from the third. There were eight elements in each group; examples of elements were: 'the team member I work most with' (intra-team), 'someone with my job on another team' (inter-team) and 'the demand forecasting system' (technology). This produced a positive/negative continuum for each construct. For example, participants were asked: 'when thinking about the concept of trust, what characteristic makes the team member you work most with similar to your best friend on the team but different from another engineer on your shift?'. Participants were then encouraged to think of their own construct, without any interference from the researcher. An example of a construct elicited was 'understanding', where this was shared by both 'the team member worked with most' and their 'best friend on the team', whereas there was a lack of understanding from 'another engineer on their shift'. Each triad of elements was repeated until all combinations of the eight elements had been exhausted, or the participant had run out of characteristics to give.

Participants were then asked to score each element using their own constructs along a five-point (1–5) Likert scale, indicating the amount of trust perceived for each element. For example, if they had given the construct of 'honesty'–'not open' as important, they then had to assign a score to each set of elements along the continuum of honesty–not open, where 5 = very honest and 1 = not at all open. Notice that constructs were not necessarily straightforward opposites at each end of the continuum (e.g. honest–dishonest); this was left to the participants' perception of the construct or characteristic. Nor did the researcher know which individuals the participant had in mind (e.g. the 'team member most worked with' or 'another engineer on their shift'). This method therefore allowed complete anonymity of team members as well as eliminating any researcher bias.
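To make the structure of the resulting data more concrete, the sketch below shows how a single participant's grid might be held as an elements-by-constructs matrix of 1–5 ratings, with each element tagged by group. This is a minimal illustration only; the element names, constructs and numbers are invented and are not taken from the study's instrument or data.

```python
# Illustrative sketch (not the authors' instrument): one participant's repertory
# grid as an elements x constructs matrix of 1-5 Likert ratings.
import numpy as np

# Hypothetical elements, each tagged by group (intra-team, inter-team, technology).
elements = [
    ("team member I work most with", "intra-team"),
    ("my best friend on the team", "intra-team"),
    ("another engineer on my shift", "intra-team"),
    ("someone with my job on another team", "inter-team"),
    ("the demand forecasting system", "technology"),
]

# Hypothetical bipolar constructs elicited from triads (positive pole listed first).
constructs = ["honesty - not open",
              "understanding - lack of understanding",
              "reliability - unreliable"]

# Ratings: rows = elements, columns = constructs, values on a 1-5 scale
# (5 = positive pole applies strongly, 1 = negative pole applies strongly).
ratings = np.array([
    [5, 5, 4],
    [5, 4, 4],
    [3, 2, 4],
    [3, 3, 3],
    [2, 3, 4],
])

# Mean rating per group for each construct, mirroring the per-group comparisons
# described later in the paper (numbers here are made up).
for group in sorted({g for _, g in elements}):
    rows = [i for i, (_, g) in enumerate(elements) if g == group]
    print(group, ratings[rows].mean(axis=0))
```

Grouping the rows by element type in this way mirrors the per-group comparisons reported later: a group's mean rating on any construct can be read straight off the matrix.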

5. Data analysis

A total of 60 different elicited constructs were reduced to 13 core constructs through content analysis. Elicited constructs sharing the same definition (taken from the Oxford English Dictionary) were categorised into a core construct; for example, constructs such as open, honest and truthful were categorised under the core construct of honesty. This was done separately by two independent researchers. Each core construct was then categorised into one of the three dimensions of emotive, cognitive or behavioural trust, in line with the Cummings and Bromiley [3] model.

Table 1
Hierarchy of perceived importance of trust constructs within each group, shown in percentages

Trust within team: "Honesty" 20%; "Understanding" 16%; "Respect" 13%; "Quality of interaction" and "Confidence" 9% each; "Proactivity", "Reliability" and "Communication" 6% each; "Teamwork" and "Commitment" 4% each; "Performance" and "Ability" 3% each; "Expectancy" 1%.

Trust between teams: "Quality of interaction" 16%; "Understanding" 12%; "Teamwork" 12%; "Honesty" and "Confidence" 10% each; "Communication" and "Reliability" 8% each; "Ability" 6%; "Commitment", "Respect", "Expectancy" and "Performance" 4% each; "Proactivity" 2%.

Trust in technology: "Quality of interaction" 21%; "Reliability" 13%; "Performance" 11%; "Understanding", "Communication", "Expectancy" and "Confidence" 10% each; "Proactivity" 7%; "Ability" 4%; "Respect" and "Honesty" 2% each.


For example, the constructs confidence, respect, commitment and teamwork were considered to be emotive characteristics and so were categorised under the emotive dimension, and so on. The cognitive dimension included the core constructs of understanding, ability and expectancy, as they were considered to concern cognitive processes, and the behavioural dimension included the characteristics of honesty, reliability, proactivity, performance, communication and quality of interaction. This exercise was repeated eight times by each researcher, and a Spearman rank correlation was carried out in order to test inter-rater reliability. The result (r_s = 0.891, n = 13, p < 0.01) showed a highly significant correlation. By calculating the frequency count of core constructs or their subordinates within each group (i.e. intra-team, inter-team and technology), a hierarchy of trust constructs in terms of importance was developed for the three groups. Mean participant scores for each core construct across all elements were then calculated. This gave an overall participant score for every element and an overall mean group score for each core construct.


In order to compare any differences between groups, Friedman ANOVA tests were carried out on the 13 core constructs. A paired Wilcoxon test was then used to identify where group differences lay. This non-parametric statistical analysis was used to compare differences across the dimensions for the constructs of trust; it is appropriate when data samples are small and unevenly distributed.
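As a concrete illustration of this analysis pipeline, the sketch below applies the same non-parametric steps (a Spearman rank correlation for inter-rater reliability, a Friedman ANOVA across the three related groups, and paired Wilcoxon tests to locate the differences) to synthetic data using SciPy. All numbers are made up; only the shape of the data (13 core constructs, 16 participants, three groups) follows the description above.

```python
# Illustrative sketch of the statistical steps described above, on synthetic data.
import numpy as np
from scipy.stats import spearmanr, friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)

# Inter-rater reliability: two researchers' frequency counts for the 13 core
# constructs, compared with a Spearman rank correlation.
rater_a = rng.integers(1, 20, size=13)
rater_b = rater_a + rng.integers(-2, 3, size=13)  # broadly agreeing counts
r_s, p = spearmanr(rater_a, rater_b)
print(f"inter-rater r_s = {r_s:.3f}, p = {p:.4f}")

# Mean trust scores per participant (n = 16) for one core construct in each group.
within_team = rng.uniform(3.5, 5.0, size=16)
between_team = rng.uniform(2.5, 4.0, size=16)
technology = rng.uniform(2.0, 4.0, size=16)

# Friedman ANOVA across the three related groups, then paired Wilcoxon tests
# to identify where the group differences lie.
chi2, p_friedman = friedmanchisquare(within_team, between_team, technology)
print(f"Friedman chi2 = {chi2:.2f}, p = {p_friedman:.4f}")

for name, other in [("between teams", between_team), ("technology", technology)]:
    stat, p_w = wilcoxon(within_team, other)
    print(f"within team vs {name}: W = {stat:.1f}, p = {p_w:.4f}")
```

The Friedman test is the repeated-measures analogue of a one-way ANOVA on ranks, which is why it suits the small, unevenly distributed samples described above; the paired Wilcoxon tests then show which pairs of groups differ.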

6. Results

Table 1 shows the most important core constructs of trust found within each group (intra-team, inter-team and when interacting with technology), presented in a top-down hierarchy. For each of the 13 core constructs categorised, a percentage was calculated for its degree of importance within each group. As can be seen from Table 1, differences were found in the importance of constructs across the three groups, although some commonality existed. Respondents perceived quality of interaction, understanding and confidence to be important core constructs across all three groups, albeit at different levels of importance in the hierarchy.

[Fig. 1. Bar chart showing mean scores (1–5) of the 13 core constructs of trust for the within-team, between-team and technology groups; x-axis: constructs of trust, y-axis: mean score.]


Quality of interaction was felt to be the most important construct in respect of trusting technology. Understanding was the next common construct across the three groups; it was perceived to be most important within teams, followed by between teams and then technology. Honesty was perceived as the most important construct within teams, whereas it was third in the hierarchy between teams. Constructs were more evenly distributed for the inter-team group, with honesty and confidence sharing the same importance; however, honesty was hardly applicable to the technology group. Confidence was perceived to be at generally the same level of importance across the three groups. Differences in the level of trust were found by comparing scores between groups for each core construct. Mean scores for each construct across the groups are presented in Fig. 1.

7. Discussion

Results showed that feelings of trust in terms of confidence, respect, commitment and teamwork were significantly higher within teams than between teams, which, from a social psychological perspective [38], is what one would expect, as an intra-team identity is formed over time. Team members working together within the same team are more likely to have developed higher trust through a sense of belonging once they reach this identification stage of trust [19]. A higher feeling of trust was also expected for physically co-located teams, as they have more opportunity for social interaction, exchanging non-verbal cues and sharing group norms, and can therefore develop greater interdependence. Some members of the same team, however, were working in separate control rooms, and yet intra-team trust still scored significantly higher than trust between teams. This seems to suggest that higher emotive trust develops in tightly cohesive teams working together over long periods of time towards a common goal. Although confidence was considered of common importance across all three groups, analysis of scores showed that team members felt significantly less confidence in their technology. This warrants some concern, as all team members had at least 3 years' experience of the systems and the majority had even longer.

The teamwork construct was rated favourably for the between-team group in terms of importance; however, it was scored significantly lower. This indicates a lower level of trust with regard to sharing the same values and goals with members of different teams. Even though computer systems play a major role in their everyday functioning, team members appeared to have little respect for the technology, nor were they generally committed to it. If behavioural constructs such as performance or quality of interaction were perceived as low for the technology, these perceptions may have influenced people's feelings of confidence and respect towards it. As Muir [28] established, trust in automation did not increase through experience but changed only with the competence of the machine, in other words, with the perceived behaviour of the technology.

The construct of understanding, which included subordinates such as knowledge, experience and familiarity, was perceived to be an important construct across all groups. From mean scores and confirmatory statistical analysis, it is apparent that team members generally had a better understanding of the technology than of colleagues in other teams. This may be because they experienced more information sharing with their various systems than with inter-team members. Alternatively, the differences could simply be because these team members have very little or no physical face-to-face contact with inter-team members. With very limited social interaction and no traditional mechanisms in place to facilitate or support such interaction, there is no opportunity to build relationships; hence, a general lack of mutual understanding between teams is apparent here. Conversely, authors such as Cerulo [2] claim that, with the growth in dispersed teams, where there is an absence of physical presence, technology is forcing people to re-adjust to the concepts of social interaction. Cerulo [2] found that, even when physically remote, complete strangers could exhibit personal, informal and even intimate exchanges through computer-mediated communication (CMC). She maintained that relationships were built not upon physical collocation but upon sharing the same goal or task. Similarly, Walther and Burgoon [40] found that reciprocity and trust could develop over time even when groups of students with no prior history worked together on a collaborative project using only CMC.


This indicates that 'cognitive' trust can develop without social cues and/or familiarity, even when people are working remotely, as long as they share some commonality. In this case, the shared understanding was the joint project the students had to complete within five weeks. In any professional setting, such as the control room environment in the current study, one would normally expect there to be a mutually shared objective even between teams. This expectancy comes from the nature of process control, where the whole system runs on a continuous 24-hour basis and relies on total interdependency between members. With the growth of inter-organizational partnerships, this mutuality or shared understanding is likely to become more significant in a virtual context. Unless this 'common business understanding' [16] is nurtured early in the development of virtual partnerships or networked alliances, trust is not likely to develop. Such an understanding may involve shared expectations about the product (or, in this case, process), cooperative agreements and a sense of shared identity. Hence, we would argue that, although the quality of interaction matters, it should be contextualised and aligned with the shared vision of the teams. Results from this study suggest that, although people were task focussed within their individual teams, the same inter-team understanding has been lost or has not been developed, creating an absence of trust between teams. Alternatively, it may be that inter-team relations were not perceived to be as trusting because of a competitive, rather than cooperative, ethos that still exists between teams in these work domains. Team members also showed a higher expectancy of the systems than of people in other teams. This may be a learned response based on past experiences of not having their expectancies met, or it may be because they have less interaction with members of other teams than with the technology. Muir and Moray's [27] research found that trust and/or distrust could develop in technology: when a system was in constant error mode, participants learned to compensate and make adjustments. Their results indicated that trust grew over time, confirming that people do need experience to develop trust in automation. The differences in the ability of the technology show that it does not always meet the expectations of the team members. Results implied that team members were significantly more honest within their own teams than with members of other teams.


This may present cause for concern in any organisation, but particularly in an environment where interdependency with other departments, including support and planning, as well as with outside agents, is crucial to the success of the continuous process. Participants also perceived that they had better communication and exchange of interaction with their systems than with people in other teams. As communication is the key element of cooperative teamworking, it would seem that there are some serious issues to be addressed with regard to raising the level of trust between teams. Reliability was an important construct for technology, although the scores did not reflect this and suggested that systems were not perceived as very reliable, similar to members of other teams. Research suggests that machine behaviour needs to be both consistent and reliable in order to foster and maintain trust in technology [27]. Perceived performance across all three groups was considered to be fairly stable, although the technology scored higher in performance than between-team members. The construct of quality of interaction was defined as the way in which people and systems interact. Although it incorporated many subordinate constructs (i.e. personable, informal, approachable, etc.), it was rated as the most important construct in the between-team and technology groups. The scores, however, did not support this, as team members rated the quality of interaction with other teams and with technology significantly lower than within their own team. This may present immense problems in terms of designing new technology. If information is not meaningfully or adequately represented in terms of enabling better interaction, then team members will be reticent about accepting it, will not be proactive in using it and hence will take longer to trust it [4]. Engineers in this study reiterated this perception. Therefore, in order to raise the level of trust in technological systems, designers need to be aware of users' expectations, and systems should be created that respond in a human-centred way.

In the expectation that virtual organizations and teamwork will continue to increase in the future, the need for consistency and reliability across technologies is vital. Research into building trust in inter-organizational contexts has argued that a lack of standardization, bandwidth and reliability in technology is associated with less effective communication and hence lower trustworthiness.


Other problems associated with building trust using technological systems are availability, capacity and user friendliness [16]. In attempting to align the interface between human and technological systems, it is a perception of control that humans need when interacting with the technology [6,7]. The more visual feedback received from the system, the more positive the perception of control, and hence the greater the trust in the technology. The results of this study support these ideas in that perceived trust in technology and inter-team trust may have been lower because of a perceived lack of control. Had the technology been more consistent in its feedback, or had inter-team members produced more continuous feedback towards each other (e.g. greater quality of interaction and/or communication), then it is more likely that trust between teams and between human and system would have improved. This is considered an important area for future research and has strong implications for inter-organizational and virtual-team trust.

8. Conclusions and implications

This study set out to gain a better understanding of how team members in volatile organizational settings conceptualised trust. The results have enabled a richer insight into the difficulties of articulating and measuring this complex concept. They suggest that, whilst team members emphasised the importance of trust in their work context, the number of constructs initially elicited seemed to indicate a wide variance in the way that trust was construed or how it could be made explicit. The team members initially had difficulty talking about such emotive issues but, as each construct was evaluated, it became more apparent that they did share a commonality in their language of trust, through their perceived importance of quality of interaction, understanding and confidence. In terms of the level of trust scored, however, it is suggested that they did not share feelings of confidence, respect or teamwork because of the shortfall in quality of interaction, openness or the other behavioural constructs. Although researchers try to develop questionnaires to measure concepts such as trust, we consider that the facets of trust are too ambiguous and that methodologies should not be driven by reductionist approaches.

Although labour intensive, the Repertory Grid method produces data richness, which gives a more in-depth appreciation of trust. From the frequency results (in terms of importance), team members expected trust to be high in terms of confidence, understanding and quality of interaction across all three groups; however, the degree of trust scored was significantly lower than expected. This suggests that, even when team members believe a person or technological system to be trustworthy and feel some sense of belonging to another team member or their computer, trust does not actually exist without tangible evidence of this being present. In other words, the team members needed observable behaviour in order for trust to develop. Perhaps the collective attribute that Lewis and Weigert [20] suggest does not begin with the cognitive process but is more likely to be driven by social action, or by continuous feedback (quality of interaction) that is relevant to the shared goal and contextual to the work domain.

Theoretically, an interdependent team with a shared common goal should inherently possess some degree of trust [3]. This assumes that members will be motivated or willing to take risks in the pursuit of that goal and/or because of the salience of group identity. Exhibiting such actions will reinforce the cognitive element of collaborative decision-making and the skill interdependency that enhances team knowledge [1]. Furthermore, behavioural factors will reinforce a feeling of collectivism between members. When team trust is displayed explicitly, this reinforces confidence within and between teams, thereby enhancing interdependency. It therefore seems that the way trust is displayed explicitly is what reinforces the other states. It is, of course, acknowledged that there are necessary antecedents of trust, in the form of a belief system or an expectancy of other members or groups, that promote the willingness to take risks in order to achieve reciprocal rewards. However, rather than the values, attitudes, emotions and moods that drive the team into acting [15], in a face-to-face or virtual team it is the actions of members that will positively or negatively reinforce the ability to trust or be trustworthy.


It is possible that this model can also be applied to information technology, as it is the competency of the technological system (e.g. consistent and reliable behaviour and continuous feedback) that ensures the growth of the operator's trust in the system. The degree of trust, although expected and seen to be important by team members within and between teams and in technology, did not match expectations when it was not supported by action. The insights gained from this study seem to indicate that, in order to nurture trust within or between teams, or more generally within any work practices in organizations, the level of awareness needed to take action [10] must be raised. The findings also suggest that the more isolated people become from each other, the less trust they perceive in each other, which emphasises the need to make trust more observable through active responses. In some instances, the scores of trust in technological systems (e.g. performance and communication) were higher than the scores for team members in other teams. Research into global teams [14] found that high-trusting teams were those that exhibited high performance. They achieved this by displaying consistent proactive behaviours, giving consistent and timely feedback and constantly negotiating with each other. Even when members had difficulty with carrying out a particular task, communicating the reasons why was considered a positive act and reinforced trust.

As organisations move further away from face-to-face interactions and reliance on technology becomes even more ubiquitous, it is vital that systems are designed to comply with natural human abilities. The type of information and the way that it is consistently displayed through technology can also enhance or impair psychological processes, in terms of psychological remoteness, distraction and loss of situational awareness, all of which can debilitate performance. It is therefore imperative that, as the workforce becomes even more remote, technology design embraces this fact, making human interaction and distributed decision making easier rather than harder. Increasingly, people have to communicate and interact with the world of work via machines, and they therefore need to be able to trust the systems they are using. In order to do this, system interfaces must match human expectations in terms of understanding how the system works and how it will help achieve better performance.


Finally, the findings from this study seem relevant to wider inter-organizational settings. In particular, the results on inter-team trust and on trust between humans and technology raise several important issues. First, in order to develop trust across organisations, it is necessary to have a shared understanding in terms of common business processes and a common goal. Through such common understanding, trust is more likely to develop across inter-organizational alliances. Second, there is a need for continuous feedback, which relates to the quality of interaction between team members in virtual alliances. Such interaction is increasingly technologically mediated; therefore, as team members become more remote, consistent feedback from systems is important if trust between inter-organizational alliances is to develop. Third, communication, even when team members are remote, is vital for inter-organizational trust to thrive. This can be enhanced by utilizing strategies for better knowledge transfer and methods of communication that best fit the personnel and systems involved. Results showed that people were less able to communicate with members of other teams than with the systems they used. This indicates that communication is a key factor in the development of trust and important for facilitating cohesive collaboration across inter-organizational contexts.

This study has also raised many questions which we consider need further exploration, for example, how some of the trust constructs would be perceived within a different context. This study was based on process control environments, which are still extremely male dominated, and the shift teams interviewed were solely male. There may therefore be gender issues that could be examined. Future research could adopt a different methodology, such as an in-depth case study approach [39], to explore the dynamics of how the trust constructs may vary across time and how they may differ across different types of inter-team structures (e.g. cross-functional teams, ad-hoc project teams) in different organizational contexts. Case studies across a variety of industrial contexts could also be used to validate the constructs found in this research. As inter-organizational and virtual partnerships now form a vital part of current business practice, research into trust across such alliances will become even more significant.



Acknowledgements

An earlier version of this paper was presented at the 10th European Conference on Information Systems (ECIS 2002) in Gdansk, Poland, where it received the "Best Research Paper" award. The authors would like to thank the attendees for their critical comments and suggestions.

References

[1] J.R. Anderson, Acquisition of cognitive skill, Psychological Review 89 (1982) 369–406.
[2] K.A. Cerulo, Technologically generated communities: reframing sociological concepts for a brave new (virtual?) world, Sociological Inquiry 67 (1) (1997) 48–58.
[3] L.L. Cummings, P. Bromiley, The organizational trust inventory: development and validation, in: R.M. Kramer, T.R. Tyler (Eds.), Trust in Organizations: Frontiers of Theory and Research, Sage, London, 1996.
[4] F.D. Davis, User acceptance of information technology: system characteristics, user perceptions and behavioural impacts, International Journal of Man-Machine Studies 38 (1993) 475–487.
[5] M. Deutsch, Trust and suspicion, Journal of Conflict Resolution 2 (1958) 265–279.
[6] P.S.E. Farrell, Appendix B: DVI applied to FOAS storyboard for cognitive cockpit, Defence and Civil Institute of Environmental Medicine, Ontario, Canada, 2000.
[7] P.S.E. Farrell, M.A.H. Semprie, Layered protocol analysis of a control display unit, DCIEM Report, vol. 97-R-70, Department of National Defence, North York, 1997.
[8] B. Friedman, P.H. Kahn, D.C. Howe, Trust online, Communications of the ACM 43 (12) (2000) 34–40.
[9] C. Handy, Trust and the virtual organisation, Harvard Business Review 73 (3) (1995) 40–50.
[10] G.E. Hawisher, C. Morgan, Electronic mail and the writing instructor, College English 55 (6) (1993) 627–643.
[11] L. Hosmer, Trust: the connecting link between organizational theory and philosophical ethics, Academy of Management Review 20 (1995) 379–403.
[12] G.P. Huber, Transfer of knowledge in knowledge management systems: unexplored issues and suggested studies, European Journal of Information Systems 10 (2001) 72–79.
[13] S.L. Jarvenpaa, D.E. Leidner, Communication and trust in global virtual teams, Journal of Computer-Mediated Communication and Organization Science: A Joint Issue 3 (1998) 1–38.

[14] S.L. Jarvenpaa, K. Knoll, D.E. Leidner, Is anybody out there? Antecedents of trust in global virtual teams, Journal of Management Information Systems 14 (4) (1998) 29–64.
[15] G.R. Jones, J.M. George, The experience and evolution of trust: implications for co-operation and teamwork, Academy of Management Review 23 (3) (1998) 531–546.
[16] E.C. Kasper-Fuehrer, N.M. Ashkanasy, Communicating trustworthiness and building trust in interorganizational virtual organizations, Journal of Management 27 (2001) 235–254.
[17] G. Kelly, The Psychology of Personal Constructs, Norton, New York, 1955.
[18] R.M. Kramer, T.R. Tyler, Trust in Organizations: Frontiers of Theory and Research, Sage, Thousand Oaks, 1996.
[19] R.J. Lewicki, B.B. Bunker, Developing and maintaining trust in work relationships, in: R.M. Kramer, T.R. Tyler (Eds.), Trust in Organizations: Frontiers of Theory and Research, Sage, London, 1996.
[20] J.D. Lewis, A. Weigert, Trust as a social reality, Social Forces 63 (1985) 967–985.
[21] J.E. McGrath, Time matters in groups, in: J. Galegher, R. Kraut, C. Egido (Eds.), Intellectual Teamwork: Social and Technological Foundations of Co-operative Work, Lawrence Erlbaum, Hillsdale, NJ, 1991, pp. 23–61.
[22] R.C. Mayer, J.H. Davis, F.D. Schoorman, An integrative model of organizational trust, Academy of Management Review 20 (1995) 709–734.
[23] E. Mayo, The Human Problems of an Industrial Civilisation, Macmillan, New York, 1993.
[24] D. Meyerson, K.E. Weick, R.M. Kramer, Swift trust and temporary groups, in: R.M. Kramer, T.R. Tyler (Eds.), Trust in Organizations: Frontiers of Theory and Research, Sage, London, 1996.
[25] R.E. Miles, C.C. Snow, The new network firm: a spherical structure built on a human investment philosophy, Organizational Dynamics 5 (Spring 1995) 5–18.
[26] B.A. Misztal, Trust in Modern Societies, Polity, Oxford, 1996.
[27] B.M. Muir, N. Moray, Trust in automation: Part II. Experimental studies of trust and human intervention in a process control simulation, Ergonomics 39 (3) (1996) 429–460.
[28] B.M. Muir, Trust in automation: Part I. Theoretical issues in the study of trust and human intervention in automated systems, Ergonomics 37 (11) (1994) 1905–1922.
[29] J. Nandhakumar, R. Baskerville, Trusting online: nurturing trust in virtual teams, in: S. Smithson, J. Gricar, M. Podlogar, S. Avgerinou (Eds.), Proceedings of the 9th European Conference on Information Systems, Bled, Slovenia, 2001, pp. 188–194.
[30] K.M. Nelson, J.G. Cooprider, The contribution of shared knowledge to IS group performance, MIS Quarterly 51 (1996) 123–140.
[31] J.S. Olson, G.M. Olson, i2i trust in e-commerce, Communications of the ACM 43 (12) (2000) 41–44.
[32] N. Panteli, S. Sockalingam, Trust and conflict within virtual inter-organizational alliances: a framework for facilitating knowledge sharing, Decision Support Systems (in press).
[33] D.M. Rousseau, S.B. Sitkin, R.S. Burt, C. Camerer, Not so different after all: a cross-disciplinary view of trust, Academy of Management Review 23 (3) (1998) 393–404.

[34] J. Slevin, The Internet and Society, Polity Press, Cambridge, 2000.
[35] C. Standing, S. Benson, Knowledge management in a competitive environment, in: S.A. Carlsson, P. Brezillon, P. Humphreys, B.G. Lundbert, A.M. McCosh, V. Rajkovic (Eds.), Decision Support through Knowledge Management, Department of Computer and Systems Sciences, University of Stockholm and Royal Institute of Technology, Sweden, 2000, pp. 336–348.
[36] N.A. Stanton, M.J. Ashleigh, A field study of teamworking in a new human supervisory control system, Ergonomics 43 (8) (2000) 1190–1209.
[37] V. Stewart, A. Stewart, Business Applications of Repertory Grid, McGraw-Hill, London, 1981.
[38] J.C. Turner, Towards a cognitive redefinition of the social group, in: H. Tajfel (Ed.), Social Identity and Intergroup Relations, Cambridge University Press, Cambridge, 1982.
[39] G. Walsham, Interpreting Information Systems in Organizations, John Wiley, Chichester, 1993.
[40] J.B. Walther, J.K. Burgoon, Relational communication in computer-mediated interaction, Human Communication Research 19 (1) (1992) 50–88.

Melanie Ashleigh is a Lecturer in Psychology, HRM and Information Systems within the School of Management at the University of Southampton, England. She achieved her PhD from the School of Engineering, University of Southampton, in 2002. Her research interests include trust in teams and technology, knowledge management, human-computer interaction and information systems. She has published her research in journals such as the Journal of Management in Engineering and Ergonomics.


Joe Nandhakumar is a Reader (Associate Professor) in Information Systems and the Director of the Centre for Information Management in the School of Management at the University of Bath, England. He has wide-ranging experience in industry and gained his PhD from the University of Cambridge, England. Dr. Nandhakumar's research focuses on the human and organizational aspects of information systems development, the organizational consequences of information technology use, and theoretical and methodological issues in information systems research. His recent work has appeared in journals such as Accounting, Organizations and Society, Information Technology and People, Information Systems Journal, The Information Society, European Journal of Information Systems and Journal of Information Technology.
