Value creation through trust in technological-mediated social participation
Technology, Innovation and Education volume 2, Article number: 5 (2016)
Abstract
In this article, we advocate the use of a socio-technical model of trust to support interaction designers in reflecting further on trust-enabling interaction design values that foster participation. Our rationale is built upon the belief that technology-mediated social participation needs trust, and that it is through trust-enabling interactions that we foster the will to collaborate and share, the two key elements of participation. The article starts by briefly presenting a socio-technical model of trust and then establishes the authors' rationale interconnecting trust with technology-mediated social participation. It continues by linking the trust value to the contexts of design critique and critical design, and ends by illustrating how to incorporate the trust value into design. This is achieved by proposing an analytical tool that can inform interaction designers in better understanding the potential design options and the reasons for choosing them.
The socio-technical model of trust
The proposed model, which we advocate be used as a design tool for informing the design of social participatory services, depicts trust as a construct informed by seven individual qualities. The model determines the extent to which one relates to one's social and technical environment. This model (see Fig. 1) is based on the unification of Davis's and Venkatesh's technology acceptance models (Venkatesh et al. 2003), combined with an extensive literature review on trust (Sousa and Lamas 2011) and complemented with participatory design sessions (Sousa et al. 2011). The resulting model (after validation) takes into consideration certain observable qualities of trust that help to determine:
- User's intentions of trust (motivation and willingness);
- User's incentives to use and accept certain technologies (competency and predictability);
- User's supports to engage in giving and taking actions (benevolence, reciprocity and honesty).
The individual observable qualities of trust are as follows (a minimal code sketch of this grouping follows the list):
- Motivation represents the degree to which an individual believes (even under conditions of vulnerability and dependence) that he or she has the ability to perform specific beneficial actions when using a computer.
- Willingness reflects positive or negative feelings about performing a given action while considering the risks and incentives.
- Competency reflects the perceived ease of use associated with the use of the system.
- Predictability represents a user's confidence that the system will help him or her perform a desired action in accordance with what is expected.
- Benevolence reflects a user's perception that most people share similar social behaviours and sharing values.
- Reciprocity represents the degree to which an individual sees himself or herself as part of a group.
- Honesty reflects a quality of reassurance when facing apprehension, or even fear, of the possibility of being deceived.
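To make this grouping concrete, the sketch below encodes the seven observable qualities and the three determinants they inform as plain Python data; the identifiers are illustrative and are not part of the model itself.

```python
from enum import Enum


class TrustQuality(Enum):
    """The seven observable qualities of the socio-technical model of trust."""
    MOTIVATION = "motivation"
    WILLINGNESS = "willingness"
    COMPETENCY = "competency"
    PREDICTABILITY = "predictability"
    BENEVOLENCE = "benevolence"
    RECIPROCITY = "reciprocity"
    HONESTY = "honesty"


# Grouping of the qualities by what they help to determine (see the list above).
TRUST_DETERMINANTS = {
    "intentions_of_trust": [TrustQuality.MOTIVATION, TrustQuality.WILLINGNESS],
    "incentives_to_use_and_accept": [TrustQuality.COMPETENCY, TrustQuality.PREDICTABILITY],
    "support_for_give_and_take": [TrustQuality.BENEVOLENCE, TrustQuality.RECIPROCITY,
                                  TrustQuality.HONESTY],
}
```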
Trust as value creation in system design
This section presents the authors' argument on why designing for trust is important, and on why and how trust can be a key value in fostering social participation. It concludes by eliciting the key elements of designing for trust.
Today our personal and business relations are built upon technology-reliant ecosystems which aggregate a variety of interaction contexts and settings. These enable us to establish new interaction scenarios built upon a range of configurations that mix technologies, applications and actors in a common space. Such a space is presented to us as an aggregator of actors and actions that enables certain behaviours, like sharing or defining our connections, our purpose of learning and our purpose of participation (Kolko 2011).
All this requires new rules of behaviour and new notions of ownership and commitment. These also represent new spaces that aggregate different types of organisations, economic services and social network relationships. We might say that these technology-mediated services facilitate the development of new social capital patterns, where two or more persons can enjoy the benefit (or cost) of sharing a value, a good, time or a skill (Light and Miskelly 2014). However, these spaces also differ greatly from simple economic transaction processes, as they evolve mostly not by avoiding feelings of commitment but by promoting them.
These new sharing economies (as some might call them) arise mostly from establishing new forms of collaboration and new notions of ownership and relationships, which lead to a variety of sharing processes (creation, production, distribution, economic trade and consumption of goods and services) in a single space. To be successful, all this demands interaction spaces that excel in their ability to "facilitate actions". This ability to "facilitate actions" can be roughly translated into the ability to create social capital (Tsai and Ghoshal 1998; Fukuyama 1995), and this demands trust, mutual reciprocity and norms of action.
Take, for example, services like VGI (volunteered geographic information) crowdsourcing of geo-data. Their success lies in peer-to-peer volunteer contributions and increases with the system's ability to incentivise users' contributions; thus concepts like willingness, trust and credibility need to be addressed (Coleman 2010). Or consider carpooling services, for which studies report that trust is the main problem hindering popularity (Correia and Viegas 2011; Massaro et al. 2009). Or consider studies reporting significant relations between users' trust and their attitudes towards sharing in open spaces in educational settings (Sousa et al. 2011; Lorenz et al. 2013), indicating that, in spite of having positive expectations towards learning in open spaces, trust was one of the students' main concerns regarding sharing and contributing in those spaces. For instance, they claimed to be more willing to share if they associated those activities with trust attitudes like willingness, predictability, honesty and benevolence. The above examples establish a direct link between sharing and trusting, which indicates the need for new models and visions in which trust plays an important role in encouraging sharing interactions. Thus, our argument is for the use of a socio-technical model of trust to contribute to promoting self-regulated participatory actions.
The key elements of designing for trust
As noted above, we see trust as a key value for promoting self-regulated participatory actions. The challenge of designing for trust lies in how designers and researchers perceive the trust value within interactions. The trust value can be seen from two distinct perspectives: one that reflects a strictly operational standpoint and another that sees trust as an internal quality.
The operational view sees trust as a value that supports users' rational decisions at a specific moment in time, based upon knowledge of the possible rewards for trusting or not. These notions are often associated with information-based systems, usually translated into a sort of general model of trust, or into policies that reflect norms and rules of behaviour. Trust in this context is often seen as the willingness to share or not to share information, and is very much connected with the notions of security, reputation, reliance and privacy.
The other approach sees trust as an internal quality that changes over time. Trust is a reflection of a state of mind, a confidence, and one's predisposition to trust another. These notions are based on a set of perceptions of others (Footnote 1) as 'trustworthy' (Dong 2010; Mayer et al. 1995; Preece 2001; Whitworth and De Moor 2003; Camp 2002).
Although both views seem divergent in nature, we believe that they complement each other, as designing for trust should not focus solely on designing reliable systems with the main aim of preventing risks. Instead, it should also focus on examining the key trust factors that support a social system sitting upon a technical base, eventually leading to a greater understanding of how individuals interact with systems and of the extensive impact of trust when designing for sustainable and self-regulated interactions (Sousa et al. 2014a, b).
The main challenge lies in how to combine these social and technical aspects into one. A common denominator among both perspectives is that trust contemplates a complex two-way relationship between individuals who are constituents of a society (Tyler and Degoey 1996; Fukuyama 1995; Mayer et al. 1995; McKnight and Chervany 2002). Another is that trust represents a calculative orientation towards risk: by trusting we assume a potential gain, while by distrusting we avoid a potential loss (Gambetta 1998; Bachrach and Gambetta 2001). Likewise, a violation of trust usually lies not in a simple isolated interpersonal event, but rather in a significant event that is likely to have an impact on the parties and on the relationship.
In sum, trust is a reflection of a state of mind; it reflects a confidence and the predisposition to trust another. This confidence or predisposition to trust is usually based on a set of perceptions of another as 'trustworthy' (Dong 2010; Mayer et al. 1995; Preece 2001). The ability to observe these reassurance elements ensures a balance between an individual's commitments and the risk involved; in our model, we call them observable qualities (see "The socio-technical model of trust" section above). These reassurance mechanisms, or observable qualities in our model, are represented by combining social and technical trust-enabling design characteristics in the system design.
Incorporating the trust value into design
In an abstract sense, the 'value' of an artefact can refer to the reason or purpose for its creation. A more specific understanding would place 'value' as something cared for by, or having qualities desirable to, the designers or the users. Let us assume that value-centric design implies identifying the intended values, and that design is defined as the intent of the designers to create some type of value through artefacts (Cockton 2005). This view states that values are incorporated into the design outcomes and are in alignment with other important design characteristics like the goals of the stakeholders, the context requirements and the expected future functionalities.
In this sense, designing implies the identification of different lenses on how we see the artefact and becomes a starting point for inviting design critique, which helps uncover opportunities for improvement (Blevis et al. 2007; Kolko 2011). We can also do critical design by ensuring that design actions lead to the inclusion of a value in 'future ways of being' (Blevis 2006; Arakelyan et al. 2014) and thus facilitate either reflection-in-action or reflection-post-action.
Based on this argument, and on the argument that when designing for trust we need to combine the social and technical aspects of trust into one, we propose a new design methodology which we envision serving two main purposes:
- One serves to help interaction designers better understand the potential design options and the reasons for choosing them;
- The other serves to help interaction designers assess existing design solutions for their intentional creation of value.
This section starts by describing the proposed trust-enabling design analytical tool and then illustrates, through examples, how to assess systems through two distinct lenses.
The first example intends to illustrate how to do design critique by performing a comparative inspection of two peer-production platforms: Wikipedia and Wordpress. The second example intends to illustrate how to do critical design by deconstructing a proposed sharing service called "BiB" and illustrating potential trust breakdowns.
The choice of these examples stems from the authors' ongoing work on the LearnMix project, which aims to re-conceptualise the e-textbook as a collection of professional and user-contributed content available on a wide variety of devices (Lamas et al. 2013). The LearnMix project deals with a complex set of concepts, ranging from the definition of the envisioned e-textbooks to the specific interactions that they should enable (Lamas et al. 2013). In this case, Wikipedia and Wordpress, as user-generated content web platforms, represent potential solutions that can influence design decisions. The assessment of a sharing service helped us to better visualise certain important trust design pitfalls concerning the fostering of sharing and cooperation attitudes.
The proposed analytical tool
The dimensions of the proposed methodology aim mainly to serve as a measuring tape for assessing trust-enabling interaction design features and pitfalls. The analytical tool builds on MacLean et al.'s (1991) design space analysis concepts, where the basic building blocks of the design space are questions, options and criteria, and the criteria are the means to assess and compare the options.
The proposed design space includes two distinct components (a minimal code sketch follows the list below).
- The static component of the design space includes the main driving question, the sub-set of questions and the set of trust analytical dimensions, as described in Fig. 2, and
- The dynamic component of the design space is dependent on the assessment context.
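A minimal sketch of how these two components might be represented is given below, assuming Python and purely illustrative names: the static part carries the driving question, the sub-questions and the seven trust dimensions, while the dynamic part holds the context-dependent features of the system under assessment.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class StaticComponent:
    """Fixed part of the design space: driving question, sub-questions
    and the seven trust analytical dimensions."""
    driving_question: str
    sub_questions: List[str] = field(default_factory=list)
    trust_dimensions: List[str] = field(default_factory=lambda: [
        "motivation", "willingness", "competency", "predictability",
        "benevolence", "reciprocity", "honesty",
    ])


@dataclass
class DynamicComponent:
    """Context-dependent part: the features of the system under assessment."""
    system_name: str
    features: List[str] = field(default_factory=list)


@dataclass
class TrustDesignSpace:
    static: StaticComponent
    dynamic: DynamicComponent


# Example instantiation for the Wikipedia assessment described below.
wikipedia_space = TrustDesignSpace(
    static=StaticComponent(
        driving_question="What influences the user's predisposition to trust Wikipedia?"),
    dynamic=DynamicComponent(
        system_name="Wikipedia",
        features=["editor engagement", "patrolled pages", "page curation"]),
)
```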
Examples of use of the trust-enabling analytical tool for design critique
To illustrate how this technique can be applied for design critique, we describe how we used the trust-enabling design space analytical tool to assess two peer-production platforms: Wikipedia and Wordpress.
The assessment tool was presented as a matrix, where the static elements of the proposed design space were positioned in the top row and the dynamic elements were positioned in the leftmost column (see Fig. 3 below).
Each feature was assessed within each analytical dimension. A three-value rating scale was used as the measurement tool (Footnote 2). Then, as a final step, the results were sorted in ascending order. The procedure was tested with three interaction design experts.
In the case of Wikipedia, the question driving the analysis was "What influences the user's predisposition to trust Wikipedia?", and in the case of Wordpress the driving question was "What influences the user's predisposition to trust Wordpress?".
The static analytical dimensions for assessing Wikipedia were represented by the seven observable qualities of trust, as described in Fig. 2, and the dynamic analytical dimensions were represented by Wikipedia's features, gathered from the Wikipedia page named Category:Wikipedia features (Footnote 3). The same procedure was followed for assessing Wordpress's trust-enabling interaction design qualities, but in this case, the dynamic elements were represented by Wordpress features gathered from a page named Wordpress features (Footnote 4).
Each feature was crossed with each dimension and rated in accordance with the three-value rating scale. Each time we needed to rate a feature, we consulted the definitions of the seven observable qualities of trust.
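As a rough illustration of this procedure, the sketch below rates a handful of features (taken from the examples discussed below) against the seven qualities on the three-value scale (see Footnote 2), aggregates each feature's ratings with a simple sum (one plausible aggregation, since the procedure itself only specifies sorting in ascending order) and lists the features from least to most trust-contributing. The ratings shown are illustrative, not the experts' actual assessments.

```python
QUALITIES = ["motivation", "willingness", "competency", "predictability",
             "benevolence", "reciprocity", "honesty"]

# feature -> rating per quality, on the three-value scale:
# 1 contributes to the intended value, -1 diminishes it, 0 (or absent) not applicable.
ratings = {
    "editor engagement": {"motivation": 1, "willingness": 1, "reciprocity": 1},
    "patrolled pages": {"predictability": 1, "honesty": 1},
    "page curation": {"motivation": 1, "predictability": 1},
}


def feature_scores(ratings):
    """Sum each feature's ratings across the seven qualities;
    unrated cells count as 0 (not applicable)."""
    return {feature: sum(cells.get(q, 0) for q in QUALITIES)
            for feature, cells in ratings.items()}


# Sort in ascending order so the features contributing least to trust come first.
for feature, score in sorted(feature_scores(ratings).items(), key=lambda kv: kv[1]):
    print(f"{feature}: {score}")
```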
The main results of applying this comparative analysis technique enabled us to perceive that both platforms implement design features to encourage the trust value in sharing and in facilitating actions (e.g. the "editor engagement", "enabling viewing and restoring deleted pages" or "publishing with ease" features). Both include online moderating user roles to ensure that collaboration happens in accordance with what is expected (e.g. "patrolled pages", "user access levels", "user management" or "administration pages"). Also, both platforms provide mechanisms to foster cooperation by creating incentives for contribution (e.g. "community building", "built-in comments", "article feedback mechanisms" or "contributions from users").
Other trust-fostering features come in the form of "notification", "page curation" or "editor engagement". These serve to highlight interactions and give new users incentives to participate. Finally, the results also revealed that explicitly stated privacy terms increase users' privacy awareness, which helps to establish more transparent and honest policies. This, complemented with clear and supportive community resources like forums, activity guides and tutorials, reflects honesty, competency and predictability.
Examples of use of the trust-enabling analytical tool for critical design
To illustrate how critical design analysis can be applied, we bring in the example of a sharing service proposal called "BiB". This service was designed by HCI master's level students for a sustainability course taught at Tallinn University, Estonia. The mobile tool's main aim is to facilitate sustainable sharing practices and to promote community synergy in society.
To examine the service, we used the trust-enabling design analytical tool (Fig. 4). A similar design space matrix approach was applied here, but in this case, the static elements of the proposed design space were positioned in the leftmost column and the dynamic elements were positioned in the top row.
To better visualise how this technique was applied, see the figure below.
Another difference between using this analytical tool for critical design rather than design critique is that, instead of rating each feature according to the three-value rating scale, we enumerate the service's potential trust breakdowns and provide possible solutions as results.
The main aim of this procedure was to enumerate possible trust design pitfalls, as the designers' propositions did not intentionally include trust design frameworks.
This information was then used to inform possible meaningful trust-enabling interaction design propositions: trust design propositions which address the need for fostering trust, as the "BiB" designers claim to do, by facilitating sharing (give and take) and promoting community synergy around the exchange of goods.
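The sketch below shows one simple way such breakdowns and candidate remedies could be recorded per observable quality; the entries paraphrase the findings discussed next, and the structure is illustrative rather than prescriptive.

```python
# One possible record format for trust breakdowns found in a design proposal,
# keyed by the observable quality they relate to. Entries paraphrase the
# "BiB" findings discussed below.
bib_trust_breakdowns = {
    "motivation": {
        "breakdown": "Few incentives for users to weigh the risk of exchanging gadgets.",
        "possible_solution": "Give clear motivational hints on why to use the tool and what users gain.",
    },
    "honesty": {
        "breakdown": "The 'circle of trust' feature only lists contacts; no cues on how deception is prevented.",
        "possible_solution": "Add clear rules, responsibility mechanisms and general usage statistics.",
    },
}

for quality, entry in bib_trust_breakdowns.items():
    print(f"{quality}: {entry['breakdown']} -> {entry['possible_solution']}")
```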
For instance, the overall interaction design activities proposed few incentives to lead users to consider the risk of using "BiB" to exchange gadgets. As a possible solution, we propose to (re)design the "BiB" service by providing clear motivational hints on why to choose the tool and what users gain by using it. The service should also provide incentives for trial and exploration in order to trigger the willingness to trust. Regarding competency qualities, these are somewhat shown through "BiB"'s aesthetics and ease of use, but they can be improved if the designers provide example information or examples of usage practices. Again, "BiB"'s coherent design can be improved, and more predictability ensured, if complemented with user support sharing mechanisms like support forums and help guidelines.

No benevolence-related design qualities were found, although the designers mention wishing for a "forgiveness" feature; we advise them to also provide caring and kindness/goodwill mechanisms through emotion "buttons". Regarding reciprocity, they proposed to provide feedback through a recommender system, which in our opinion is not enough; they could also enable people to testify on others' or on the tool's behalf, and create small support communities of people with similar interests in order to enable friend (or friend-of-a-friend) recommendations. Honesty hints are missing: they created a feature called the "circle of trust", but only for listing contacts. They should add to that feature cues for users on what the service does to prevent the possibility of deception. We suggest complementing this feature by defining clear rules and responsibility mechanisms that ensure the expulsion of those who do not follow the rules, and by creating warning and advice lists. They should also provide general statistics on tool usage.
Conclusions
The aim of this article was to advocate the use of a socio-technical model of trust to support interaction designers in further reflecting on trust as value creation in system design. This rationale was built upon illustrating how interaction designers can reflect on values through the application of a value-centric design space analytical lens.
In this sense, we perceive two main potential applications for the proposed design space. One, as the examples above demonstrate, is the assessment of existing systems, supporting the designer in perceiving what features could leverage trust-enabling interactions. The other is to inform the design process with regard to the possibilities or attributes that could facilitate trust-enabling interaction processes.
In this regard, we see the design space more as a supportive tool for a design process predicated on a humanistic approach based on human-centred design (HCD) and participatory design (PD) than as a substitute for logical or engineering-based design.
Moreover, the examples provided show the design space to be a useful value-sensitive design rationale visualisation tool. However, it can be time consuming if the process is followed in meticulous detail, for example, if we attempt to turn the design process into a purely structured process. Such an approach risks killing creativity or hindering progress within the project. Thus, our suggestion for using this analytical tool is to consider its outputs as additional inputs for value incorporation and reflection, rather than as final recommendations showing exactly which attributes to design for.
Notes
A society, a person and/or a technological artefact.
Three-value rating scale measurement: 1 means 'it contributes to the intended value', −1 means 'it diminishes the intended value' and 0 means 'not applicable'.
References
Arakelyan A, Smorgun I, Sousa S (2014) Incorporating values into the design process: the case of e-textbook development for Estonia. Advances in web-based learning, Lecture notes in computer science. Springer, Berlin, pp 274–281
Bachrach M, Gambetta D (2001) Trust as type detection. In: Castelfranchi C (ed) Trust and deception in virtual societies. Kluwer Academic Publishers, Faridabad
Blevis E (2006) Advancing sustainable interaction design two perspectives on material effects. Des Philos Pap 4:1–19
Blevis E, Lim YK, Roedl D, Stolterman E (2007) Using design critique as research to link sustainability and interactive technologies. Springer, Berlin
Camp LJ (2002) Designing for trust. Trust, reputation, and security. Springer, Berlin, pp 15–29
Cockton G (2005) A development framework for value-centred design. In: CHI ’05 extended abstracts on human factors in computing systems–CHI ’05. ACM Press, New York, p 1292. http://dl.acm.org/citation.cfm?id=1056899
Coleman DJ (2010) Volunteered geographic information in spatial data infrastructure: an early look at opportunities and constraints. In: GSDI 12 world conference
Correia G, Viegas JM (2011) Carpooling and carpool clubs: clarifying concepts and assessing value enhancement possibilities through a stated preference web survey in Lisbon, Portugal. Transp Res Part A Policy Pract 45(2):81–90
Dong Y (2010) The role of trust in social life. In: Yan Z (ed) Trust modeling and management in digital environments: from social concept to system development. IGI Global, Finland, pp 421–440
Fukuyama F (1995) Trust: the social virtues and the creation of prosperity. Free press, New York (Technical report)
Gambetta D (1998) Trust making and breaking co-operative relations. In: Gambetta D (ed) Can we trust trust?. Basil Blackwell, Oxford, pp 213–237
Kolko J (2011) Endless nights-learning from design studio critique. Interactions 18(2):80–81
Lamas D, Väljataga T, Laanpere M, Rogalevits V, Arakelyan A, Sousa S, Shmorgun I (2013) Foundations for the reconceptualization of the e-textbook. In: Proceedings of the international conference on e-learning ICEL 2013, p 510
Light A, Miskelly C (2014) Design for sharing. Research report, Northumbria University, EPSRC Digital Economy Sustainable Society Network+. https://designforsharingdotcom.files.wordpress.com/2014/09/design-for-sharing-webversion.pdf. Accessed Jan 2014
Lorenz B, Sousa S, Tomberg V (2013) Privacy awareness of students and its impact on online learning participation—a case study. Open and social technologies for networked learning. Springer, Berlin, pp 189–192
MacLean A, Young RM, Bellotti V, Moran TP (1991) Questions, options, and criteria: elements of design space analysis. Hum Comput Interact 6(3–4):201–250
Massaro DW, Chaney B, Bigler S, Lancaster J, Iyer S, Gawade M, Eccleston M, Gurrola E, Lopez A (2009) Carpoolnow-just-in-time carpooling without elaborate preplanning. In: WEBIST, pp 219–224
Mayer RC, Davis JH, Schoorman FD (1995) An integrative model of organizational trust. Acad Manag Rev 20(3):709–734
McKnight D, Chervany N (2002) Trust and distrust definitions: one bite at a time. In: Falcone R, Singh MP, Tan Y (eds) Trust in cyber-societies: integrating the human and artificial perspectives. Springer, Berlin, pp 27–54
Preece J (2001) Etiquette, empathy and trust in communities of practice: stepping-stones to social capital. J Univers Comput Sci 10(3):194–202
Sousa SC, Tomberg V, Lamas DR, Laanpere M (2011) Interrelation between trust and sharing attitudes in distributed personal learning environments: the case study of lepress ple. Advances in web-based learning-ICWL 2011. Springer, Berlin, pp 72–81
Sousa SC, Lamas D (2011) Emerging trust patterns in online communities. In: Internet of things (iThings/CPSCom), 2011 international conference on and 4th international conference on cyber, physical and social computing. IEEE, pp 313–316
Sousa S, Lamas D, Dias P (2011) A framework for understanding online learning communities. In: Leung H, Popescu E, Cao Y, Lau R, Nejdl W (eds) ECEL, pp 1000–1004
Sousa S, Dias P, Lamas D (2014a) A model for human–computer trust: a key contribution for leveraging trustful interactions. In: 2014 9th Iberian conference on information systems and technologies (CISTI). IEEE, pp 1–6
Sousa S, Shmorgun I, Lamas D, Arakelyan A (2014b) A design space for trust-enabling interaction design. In: Proceedings of the 2014 multimedia, interaction, design and innovation international conference on multimedia, interaction, design and innovation. ACM, Oxford, pp 1–8
Tsai W, Ghoshal S (1998) Social capital and value creation: the role of intrafirm networks. Acad Manag J 41(4):464–476
Tyler TR, Degoey P (1996) Beyond distrust: getting even and the need for revenge. In: Kramer R, Tyler T (eds) Trust in organisations: frontiers of theory and research. SAGE publications Inc., California, p 429
Venkatesh V, Morris MG, Davis GB, Davis FD (2003) User acceptance of information technology: toward a unified view. MIS Q 27(3):425–478
Whitworth B, De Moor A (2003) Legitimate by design: towards trusted socio-technical systems. Behav Inf Technol 22(1):31–51
Authors' contributions
SCS contributed to the trust-related sections and conceived the proposed analytical tool. DL contributed to the section on incorporating the trust value into design, and participated in the study design and coordination. PD helped to draft the manuscript. All authors read and approved the final manuscript.
Acknowledgements
This work was supported by FCT and Universidade Aberta. The views expressed herein do not necessarily represent the views of FCT or Universidade Aberta. The authors thank Arman Arakelyan and Ilja Smorgun for their contributions, through which the proposed design space analysis was refined.
Competing interests
The authors declare that they have no competing interests.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Sousa, S., Lamas, D. & Dias, P. Value creation through trust in technological-mediated social participation. Technol Innov Educ 2, 5 (2016). https://doi.org/10.1186/s40660-016-0011-7