Robotic Process Automation is receiving significant attention due to its promise of improving the performance of an organization's main processes by incorporating robots that partially perform repetitive tasks. In this project, we will consider how Process Mining can help in finding opportunities to apply Robotic Process Automation in a real case study.
Recently, one of the leaders in Robotic Process Automation acquired one of the main process mining tools (https://www.uipath.com/newsroom/uipath-acquires-process-gold-unparalleled-process-understanding). This confirms the potential link between the field of process mining and the field of robotic process automation.
In this project, we will try to find out how strong this link is. Using real data from a company that is trying to automate its processes, the student will dig into the field of process mining to propose a methodology that unleashes the application of RPA.
For this project, there is the possibility of a grant covering the time invested.
Starting from a snow avalanche model developed at UPC, which simulates the dynamics of this phenomenon, we want to carry out a full validation of the model so that the avalanche specialists of the ICGC can use it as a tool in their decision-making process.
The Validation, Verification and Accreditation of a model is essential to be able to use it effectively in production for decision making. The project aims to validate both the model and its implementation so that the end result reproduces the natural dynamics of the phenomenon. With this validation, the model will be used as a support tool by the avalanche team of the ICGC. During the development of the project, the ICGC specialists taking part in this validation process will provide support. Advanced DOE (Design of Experiments) techniques will be applied during the project.
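As a toy illustration of one possible validation metric (an assumption for illustration only; the actual validation protocol and DOE design will be defined with the ICGC specialists), simulated avalanche runout distances can be compared against observed events:

```python
import math

# Minimal sketch, assuming runout distance is one of the validated
# outputs: RMSE between simulated and observed avalanche runouts.
# All numbers below are illustrative, not real ICGC data.

def rmse(simulated, observed):
    """Root-mean-square error between paired simulated/observed values."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed))
                     / len(observed))

observed_runout  = [820.0, 640.0, 905.0]   # metres, illustrative events
simulated_runout = [810.0, 655.0, 930.0]   # model output for the same events
error = rmse(simulated_runout, observed_runout)
```

A full validation campaign would compute such metrics over a designed set of simulation runs rather than a handful of cases.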
The goal of the project is to create an interactive tool that lets the user visually compare the outcomes of different weather forecast data sources.
Weather forecasts are not 100% reliable. However, it is difficult to determine which service provides the most accurate data. The goal of the project is to gather, process, and clean data from different sources, and to create a tool that facilitates the visual analysis and comparison of these data.
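As a minimal sketch of how such a comparison could be scored (the metric, values, and source names below are illustrative assumptions, not part of the project specification):

```python
# Score each forecast source against observed values with a simple
# mean absolute error; real inputs would come from the collected,
# cleaned per-source forecast archives.

def mean_absolute_error(forecast, observed):
    """Average absolute deviation between forecasts and observations."""
    return sum(abs(f - o) for f, o in zip(forecast, observed)) / len(observed)

observed_temp = [12.1, 13.4, 11.8, 10.9]        # measured temperatures
sources = {
    "source_a": [12.0, 13.0, 12.5, 11.0],       # hypothetical provider A
    "source_b": [11.5, 14.2, 11.9, 10.2],       # hypothetical provider B
}

scores = {name: mean_absolute_error(fc, observed_temp)
          for name, fc in sources.items()}
best = min(scores, key=scores.get)
```

The interactive tool would present such per-source scores visually, rather than as a single number.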
Walking in Place is an intuitive and popular method to navigate virtual environments. Its main advantage is that it can lessen motion sickness compared to other devices such as hand-held controllers. However, when navigating an immersive virtual environment using a Head-Mounted Display, the user loses all references to the real world, which makes it difficult to stay within the same physical spot. In this thesis, we want to explore how Walking in Place can be combined with real walking to achieve a smooth transition between navigation in large and small environments. We also want to investigate different approaches to guarantee that the user stays within the limits of the space tracked by the HTC VIVE base stations.
FHIR (Fast Healthcare Interoperability Resources) is a set of standards developed by HL7 International to facilitate eHealth information interoperability and use. In parallel, different efforts are under way to improve the representation (better compression and security) of genomic information, such as those from the GA4GH (Global Alliance for Genomics and Health) and the MPEG standardization committee. The DMAG (Distributed Multimedia Applications Group) of the Computer Architecture Department of UPC is involved in the specification of some of these new standards. The objective of this project is to integrate genomic information into EHRs (Electronic Health Records). For this purpose, the different standards for the representation of medical and genomic information will be analysed, and FHIR will be used to facilitate that integration. Finally, a small prototype will be developed, probably making use of existing open-source software. The results of this work could be contributed to one of the standardization organizations for consideration.
The weather-dependent routing algorithm will be integrated into a service for the logistic platforms of food distribution companies, enabling self-response capabilities during severe weather events. Using a Multi-Hazard Early Warning System (MH-EWS) as input, the effects of weather will be crossed with a representation model of the road network. The model will be used to provide alternative routes in anticipation of the logistic demand (total freight to be moved between warehouses). The resulting routes should be displayed on a map.
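The routing step can be sketched as a shortest-path search over a road graph whose travel times are inflated by hazard factors derived from the MH-EWS. Everything below (graph, travel times, hazard factors) is an illustrative assumption:

```python
import heapq

# Dijkstra over travel times multiplied by a per-edge hazard factor
# (1.0 = normal conditions; higher values penalize affected segments).

def shortest_route(graph, hazard, start, goal):
    """Return (cost, path) of the cheapest weather-adjusted route."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, minutes in graph.get(node, {}).items():
            factor = hazard.get((node, nxt), 1.0)
            heapq.heappush(queue, (cost + minutes * factor, nxt, path + [nxt]))
    return float("inf"), []

roads = {
    "warehouse": {"A": 30, "B": 45},
    "A": {"store": 40},
    "B": {"store": 20},
}
# A severe-weather alert triples the cost of the warehouse->A segment.
alerts = {("warehouse", "A"): 3.0}

cost, route = shortest_route(roads, alerts, "warehouse", "store")
```

With the alert active, the search avoids the nominally shorter route through A and diverts the freight through B.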
In a world that has been seeing a gradual miniaturization/acceleration of wireless communications to meet the explosive growth of data traffic, the NaNoNetworking Center in Catalunya (N3Cat) aims to shape the wireless networks of the future, even beyond the 5G and Internet of Things (IoT) paradigms. The main focus of N3Cat is to conceive ultra-fast, ultra-small, and ultra-efficient wireless communications to enable new and disruptive applications such as wireless on-chip networks, reconfigurable metamaterials, body area networks, or even intra-body networks.
N3Cat is looking for students wanting to work in the area of ultrafast and short-range wireless communications and applications. To this end, the candidate will work on one of the following areas:
· Development of antenna structures based on graphene or other nanomaterials with unprecedented miniaturization and reconfiguration capabilities.
· Characterization of the wireless channel in challenging environments for applications such as on-chip networks or reconfigurable metamaterials.
· Design of generic communication methods and protocols capable of working under ultra-stringent area and power conditions, yet with ultra-high speeds.
· Design of custom protocols and new architectures uniquely suited to applications such as on-chip networks or reconfigurable metamaterials.
The goal of this project is to create a security module for the Open Daylight platform using a tool similar to Bitcoin called Hyperledger; both of them are open-source projects. Taking advantage of the security properties of Bitcoin (distributed, trustless, auditable), it is possible to securely assign IP addresses to their owners, allowing the detection of parties that are not using their legitimate IP addresses. In addition, with the extended capabilities of Hyperledger, we can distribute more information than just IP prefixes, such as public keys, configuration parameters, etc. The student will get involved in the joint development team between Cisco Systems (HQ, San Jose CA) and UPC.
The project will consist of: (1) becoming familiar with Hyperledger; (2) developing code in Hyperledger to manage the IP prefixes; and (3, optionally) interfacing with Open Daylight.
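As an illustrative sketch of the ledger state such a module would maintain, a minimal in-memory prefix-to-owner registry is shown below. The class, method, and field names are assumptions for illustration, not the actual Hyperledger chaincode API:

```python
# Toy model of the key-value state the ledger would hold: IP prefixes
# mapped to their registered owners plus extra metadata (e.g. public
# keys), enabling detection of illegitimate prefix announcements.

class PrefixLedger:
    def __init__(self):
        self._state = {}

    def register(self, prefix, owner, pubkey=None):
        """Record ownership of a prefix; refuse duplicate registrations."""
        if prefix in self._state:
            raise ValueError(f"{prefix} already registered")
        self._state[prefix] = {"owner": owner, "pubkey": pubkey}

    def verify(self, prefix, claimed_owner):
        """Detect parties announcing prefixes they do not own."""
        entry = self._state.get(prefix)
        return entry is not None and entry["owner"] == claimed_owner

ledger = PrefixLedger()
ledger.register("192.0.2.0/24", "AS64500", pubkey="...")
legit = ledger.verify("192.0.2.0/24", "AS64500")
rogue = ledger.verify("192.0.2.0/24", "AS64501")
```

In the real module, the registrations and lookups would be transactions executed against the distributed Hyperledger state rather than a local dictionary.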
How to apply: Please send an email to email@example.com including your academic file and CV.
Deep learning techniques are commonly used as a black box, and it is difficult to understand what is really happening during training. The goal of the project is to facilitate the understanding of machine translation systems.
The application must be created as an interactive document where the process of machine translation is explained and visualized at the same time. Features for interactive exploratory analysis must be added to help the users understand how the change of parameters affects the performance. The result should be an Explainable AI document (e.g. something like the ones appearing here: https://visxai.io/).
Web tracking technologies are extensively used to collect large amounts of personal information (PI), including the things we search, the sites we visit, the people we contact, or the products we buy. Although it is commonly believed that this data is mainly used for targeted advertising, some recent works revealed that it is exploited for many other purposes, such as price discrimination, financial credibility assessment, insurance coverage, government surveillance, background scanning, or identity theft. The main objective of this project is to apply network traffic monitoring and analysis technologies to uncover the particular methods used to track Internet users and collect PI. This project will be useful for both Internet users and the research community, and will produce open-source tools, real data sets, and publications revealing the most privacy-invasive practices. Some preliminary results of our work in this area were recently published in Proceedings of the IEEE (IF: 9.237) and featured in a Wall Street Journal article.
More info at:
The goal of the project is to design and evaluate an approximate floating-point adder that can be useful in environments that tolerate a certain loss of precision in their computations. First, the state of the art will be studied, for both integer and floating-point data. Then, an approximate floating-point adder will be designed. It will be specified in VHDL, and its characteristics will be evaluated in terms of precision, delay, area, energy consumption, etc.
The Barcelona Neural Networking Center (BNN-UPC) is offering two positions to develop the Master Thesis in the field of Graph Neural Networks (GNN) applied to computer networking. This TFM will be fully funded and will be carried out in the context of a large industrial project with a major multinational technology company.
Graph Neural Networks (GNN) have recently been proposed to learn, model, and generalize over graph-structured data. Computer networks are fundamentally graphs, and many of their relevant characteristics, such as topology and routing, are represented as graph-structured data.
GNN are a central tool for applying ML techniques to computer networks. GNN can learn the relationships between complex network characteristics and build relevant models that can be useful to plan and manage a network. In combination with Deep Reinforcement Learning (DRL) techniques, GNN can help develop autonomous network optimization mechanisms that will result in unprecedented performance, achieving the ultimate vision of self-driving networks.
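A minimal sketch of the message-passing idea behind GNNs, applied to a toy router topology. This uses plain Python with a fixed mean aggregator; a real system would use a GNN library with learned aggregation and update weights:

```python
# One round of GNN-style message passing: each node updates its state
# from the states of its neighbours in the network topology.

def message_passing_round(adjacency, features):
    """Mix each node's state with the mean of its neighbours' states."""
    updated = {}
    for node, neighbours in adjacency.items():
        msgs = [features[n] for n in neighbours]
        agg = sum(msgs) / len(msgs) if msgs else 0.0
        # fixed 50/50 combination of own state and aggregated message;
        # in a trained GNN these coefficients are learned
        updated[node] = 0.5 * features[node] + 0.5 * agg
    return updated

topology = {"r1": ["r2", "r3"], "r2": ["r1"], "r3": ["r1"]}
link_load = {"r1": 0.9, "r2": 0.1, "r3": 0.5}   # illustrative node features
link_load = message_passing_round(topology, link_load)
```

Stacking several such rounds lets information propagate across the whole topology, which is what allows a GNN to reason about routing-dependent quantities.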
The Barcelona Neural Networking Center (https://bnn.upc.edu) is a new research initiative of UPC with the main goal of carrying out fundamental research in the field of Graph Neural Networks applied to Computer Networks, and providing education and training to the new generation of Computer Networking students.
The main goal of this project is to develop a network monitoring system that network operators can use to detect Bitcoin miners (or miners of other blockchain technologies) in their networks. The system will rely only on network measurements obtained with standard network measurement tools, and will estimate interesting characteristics of the detected miners, such as power consumption.
How to apply: Please send an email to with your CV and academic file (a PDF can be generated from the Racó).
The identification of the applications behind network traffic (i.e. traffic classification) is crucial for ISPs and network operators to better manage and control their networks. However, the increasing use of encryption and web-based applications makes this identification very challenging. This problem is exacerbated by the widespread deployment of content distribution networks (e.g. Akamai) and cloud-based services (e.g. Amazon AWS). The goal of this project is to develop a traffic monitoring tool to accurately identify web services from HTTPS traffic, including Google, YouTube, Facebook, and Twitter, among others. The tool will combine the information from IP addresses and DNS with novel classification methods inspired by the Google PageRank algorithm to identify encrypted traffic, even if served from Akamai, AWS or Google infrastructures. This project will be carried out in collaboration with the tech-based company Talaia Networks (https://www.talaia.io), which develops cloud-based network monitoring solutions.
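As a toy illustration of the PageRank ingredient (a plain power iteration over an invented graph linking service names and infrastructure IPs; the actual classification method developed in the project would build on this kind of ranking signal, not on this exact code):

```python
# Plain PageRank by power iteration over a directed graph. Nodes are
# service names and server IPs; edges are illustrative observed
# associations (DNS answers, flows), not real measurement data.

def pagerank(links, damping=0.85, iterations=50):
    nodes = list(links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {}
        for n in nodes:
            incoming = sum(rank[m] / len(links[m])
                           for m in nodes if n in links[m])
            new[n] = (1 - damping) / len(nodes) + damping * incoming
        rank = new
    return rank

graph = {
    "youtube.com": ["cdn_ip_1", "cdn_ip_2"],
    "google.com":  ["cdn_ip_1"],
    "cdn_ip_1":    ["youtube.com"],
    "cdn_ip_2":    ["youtube.com"],
}
ranks = pagerank(graph)
```

Intuitively, an IP referenced by many popular services accumulates rank, which helps attribute encrypted flows towards shared CDN infrastructure to the right web service.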
How to apply: Please send an email to firstname.lastname@example.org with your CV and academic file (pdf can be generated from the Raco).