Podstawy projektowania interfejsów użytkownika (Fundamentals of User Interface Design)

A good user interface is the foundation of every application's success! Learn the different ways a user can communicate with a computer. Find out what modern devices and programs offer in terms of user interfaces. Learn to design attractive, easy-to-use user interfaces.

The user interface is the showcase of every computer application and website, as well as the gateway to their functionality. Even the best, most efficient, and most capable software has little chance of market success if its interface is clunky, ugly, or unintuitive, and using it is tiring. People simply like working with attractively and functionally designed software, and they want to do so in the simplest possible way, without spending too much time on learning. Building friendly user interfaces is an art well worth mastering by anyone who creates software that communicates with people. The book "Podstawy projektowania interfejsów użytkownika" will help you do just that, introducing you to this broad and interesting field. In it you will find descriptions of selected input/output devices and the interface elements in use today, information on how interface designs are created and evolve, and practical examples of both good and bad solutions in this area.

- Basic information about UI
- Modes of human-computer interaction
- Graphical user interfaces and their elements
- Multimedia and multimodal solutions
- Principles of user interface design
- Methods for analyzing and evaluating interfaces
- Interfaces used in VR and AR

Are you a good programmer? Start designing your user interfaces well, too!

Comparison of selected off-the-shelf solutions for emotion recognition based on facial expressions

The paper concerns the accuracy of emotion recognition from facial expressions. As several ready-made off-the-shelf solutions are available on the market today, this study aims at a practical evaluation of selected solutions in order to provide some insight into what potential buyers might expect. Two solutions were compared: FaceReader by Noldus and Xpress Engine by QuantumLab. The performed evaluation revealed that the recognition accuracies differ for photo and video input data, and therefore solutions should be matched to the specificity of the application domain.
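The kind of per-input-type comparison described above can be sketched as follows. This is a minimal illustration, not the paper's actual evaluation protocol; the tool names are taken from the abstract, while the record format and the sample data are assumptions.

```python
# Hypothetical evaluation records: (tool, input_type, predicted, actual).
results = [
    ("FaceReader", "photo", "happy", "happy"),
    ("FaceReader", "video", "sad", "angry"),
    ("XpressEngine", "photo", "happy", "happy"),
    ("XpressEngine", "video", "angry", "angry"),
]

def accuracy_by_input(results, tool):
    """Compute recognition accuracy separately for each input type."""
    stats = {}  # input_type -> (correct, total)
    for t, input_type, predicted, actual in results:
        if t != tool:
            continue
        correct, total = stats.get(input_type, (0, 0))
        stats[input_type] = (correct + (predicted == actual), total + 1)
    return {k: c / n for k, (c, n) in stats.items()}
```

Keeping photo and video accuracies separate, rather than averaging them, is exactly what allows a buyer to match a tool to the input modality of their application.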

Evaluation of affective intervention process in development of affect-aware educational video games

M. Szwoch – 2016
This paper presents initial experiences with applying a specific methodology of affective intervention design (AFFINT) to the development of affect-aware educational video games. In the described experiment, 10 student teams develop affect-aware educational video games, using AFFINT to formalize the whole process. Although all projects are still in progress, first observations and conclusions can already be presented.

Methodology of Affective Intervention Design for Intelligent Systems

This paper concerns how intelligent systems should be designed to make adequate, valuable, and natural affective interventions. The article proposes a process for choosing an affective intervention model for an intelligent system. The process consists of 10 activities that allow for step-by-step design of an affective feedback loop and takes into account the following factors: expected and desired emotional states, characteristics of the system, and available input channels for emotion recognition. The proposed method is supported by three examples of intelligent systems that have used the process to design or to describe affective intervention. Two of them are case studies of affective video games that were designed with the proposed approach and evaluated against their non-affective versions. Another example described in the article is the affective tutoring system Gerda, which exemplifies a more complex and sophisticated intervention model.

Virtual Sightseeing in Immersive 3D Visualization Lab

The paper describes the modern Immersive 3D Visualization Lab (I3DVL) established at the Faculty of Electronics, Telecommunications and Informatics of the Gdańsk University of Technology (GUT) and its potential to prepare virtual tours and architectural visualizations on the example of the application allowing a virtual walk through the Coal Market in Gdańsk. The paper presents devices of this laboratory (CAVE, walk simulator etc.), describes methods of “immersing” a human in a virtual environment (city, building etc.) and discusses future possibilities for development (directions of research and limitations of today's hardware and software).

Acquisition and indexing of RGB-D recordings for facial expressions and emotion recognition

In this paper the comprehensive KinectRecorder tool is described, which provides convenient and fast acquisition, indexing, and storing of RGB-D video streams from the Microsoft Kinect sensor. The application is especially useful as a supporting tool for the creation of fully indexed databases of facial expressions and emotions that can be further used for training and testing emotion recognition algorithms for affect-aware applications. KinectRecorder was successfully used to create the Facial Expression and Emotion Database (FEEDB), significantly reducing the time of the whole project, which consisted of data acquisition, indexing, and validation. FEEDB has already been used as a training and testing dataset for several emotion recognition algorithms, which proved the utility of both the database and the KinectRecorder tool.

An extension to the FEEDB Multimodal Database of Facial Expressions and Emotions

M. Szwoch , L. Marco-Gimenez, M. Arevalillo-Herráez, A. Ayesh – 2015
FEEDB is a multimodal database that contains recordings of people expressing different emotions, captured using a Microsoft Kinect sensor. Data were originally provided in the device's proprietary format (XED), requiring both the Microsoft Kinect Studio application and a Kinect sensor attached to the system to use the files. In this paper, we present an extension of the database. For a selection of recordings, we also provide a frame-by-frame analysis in text format that allows one to use the data directly for classification purposes. The provided data include many different features computed from the original recordings, namely: tracking status, 6 animation units, head position, head pose (pitch, roll, and yaw), and 100 tracked points. These were extracted using the Microsoft Face Tracking capabilities and are provided for each frame.
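A per-frame text record of the kind listed above can be consumed directly by a classifier. The sketch below shows one way to parse such a record; the column order and comma-separated layout are assumptions for illustration, not the actual FEEDB export format.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FaceFrame:
    tracking_status: int
    animation_units: List[float]        # 6 AU values
    head_position: Tuple[float, ...]    # (x, y, z)
    head_pose: Tuple[float, ...]        # (pitch, roll, yaw)
    points: List[Tuple[float, float]]   # 100 tracked points

def parse_frame(line: str) -> FaceFrame:
    """Parse one frame from a comma-separated line.

    Assumed (hypothetical) column order: tracking status, 6 animation
    units, head x/y/z, pitch/roll/yaw, then 100 (x, y) point pairs,
    i.e. 1 + 6 + 3 + 3 + 200 = 213 values in total.
    """
    v = [float(x) for x in line.strip().split(",")]
    return FaceFrame(
        tracking_status=int(v[0]),
        animation_units=v[1:7],
        head_position=tuple(v[7:10]),
        head_pose=tuple(v[10:13]),
        points=[(v[i], v[i + 1]) for i in range(13, 213, 2)],
    )
```

Once parsed, the flat feature vector `v` (or any subset of it) can be fed straight into a standard classifier, which is exactly the use case the text-format extension enables.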

Design Elements of Affect Aware Video Games

M. Szwoch – 2015
In this paper, issues of the design and development process of affect-aware video games are presented, and several important design aspects of such games are pointed out. A concept of a middleware framework is proposed that separates the development of affect-aware video games from the emotion recognition algorithms and the supporting input sensors. Finally, two prototype affect-aware video games are presented that conform to the proposed architecture and use behavioral information to dynamically adjust the level of difficulty to the recognized player's emotions.

Detection of Face Position and Orientation Using Depth Data

In this paper an original approach is presented for real-time detection of the user's face position and orientation based only on the depth channel of a Microsoft Kinect sensor, which can be used for facial analysis in scenes with poor lighting conditions where traditional algorithms based on the optical channel may fail. Thus the proposed approach can support, or even replace, algorithms based on the optical channel or on skeleton or face tracking information. The accuracy of the proposed algorithm is 91%, verified on the Facial Expressions and Emotions Database using 169 recordings of 25 persons. As the processing time is below 20 ms per frame on a standard PC, the proposed algorithms can be used in real-life applications. The presented algorithms were validated in a prototype application for user emotion recognition based on depth channel information only.

Emotion Recognition for Affect Aware Video Games

In this paper the idea of affect-aware video games is presented. A brief review of automatic multimodal recognition of facial expressions and emotions is given. The first results of emotion recognition using depth data, as well as a prototype affect-aware video game, are presented.