Machine Learning (ML) has taken the world by storm and has become a fundamental pillar of engineering. As a result, the last decade has witnessed explosive growth in the use of deep neural networks (DNNs) to exploit the advantages of ML in virtually every aspect of our lives: computer vision, natural language processing, medicine, and economics are just a few examples. However, NOT all DNNs fit all problems: convolutional NNs are well suited to computer vision, recurrent NNs to temporal analysis, and so on. In this context, the main focus of N3Cat and BNN-UPC is to explore the possibilities of a newer and less explored variant called Graph Neural Networks (GNNs), which aim to learn and model graph-structured data. This has huge implications in fields such as quantum chemistry, computer networks, and social networks, among others.

OBJECTIVES
===========
N3Cat and BNN-UPC are looking for students wanting to work in the area of Graph Neural Networks, studying their uses, processing architectures, and algorithms. To this end, the candidate will work on ONE of the following areas:
- Investigating the state of the art in this area, surveying existing work in terms of applications, processing frameworks, algorithms, benchmarks, and datasets. This can be approached from a hardware or software perspective.
- Helping to build a testbed formed by a cluster of GPUs running PyTorch or TensorFlow. We will instrument the testbed to measure the computation workload and the communication flows between GPUs.
- Analyzing the communication workload of running a GNN, either in the testbed or by means of architectural simulations.
- Developing means of accelerating GNN processing in software (e.g., improving the scheduling of the message passing) or hardware (e.g., designing a domain-specific architecture).
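To make the message-passing idea concrete, the following is a minimal sketch of a single GNN layer using NumPy. The toy graph, feature sizes, and random weights are illustrative assumptions only, not part of any project deliverable:

```python
import numpy as np

# Toy undirected graph with 4 nodes, stored as an adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

rng = np.random.default_rng(0)
H = rng.random((4, 8))   # node features: 4 nodes, 8 dimensions
W = rng.random((8, 8))   # "learnable" weight matrix (random stand-in)

def message_passing_step(A, H, W):
    """One GNN layer: aggregate neighbour features, then transform."""
    deg = A.sum(axis=1, keepdims=True)       # node degrees
    agg = (A @ H) / np.maximum(deg, 1)       # mean over each node's neighbours
    return np.maximum(agg @ W, 0)            # linear transform + ReLU

H_next = message_passing_step(A, H, W)
print(H_next.shape)  # (4, 8): same nodes, updated embeddings
```

Stacking several such steps lets each node's embedding incorporate information from progressively larger neighbourhoods; the scheduling of exactly this aggregate-and-transform loop is what the software-acceleration line of work above targets.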
The main problem in Virtual Reality is the impossibility of studying high-density crowd environments, due to the lack of realism between the visualization and the tactile sensations (pushing, pressure). For this reason, we propose to study the use of a haptic joystick to steer the movement of an avatar in an environment with a high density of agents. We aim to evaluate whether the use of these devices increases realism and immersion, in order to advance this kind of study in scenarios with a high density of dynamic objects.
Companies and scientists working in areas such as finance or genomics are generating enormously large datasets (on the order of petabytes), commonly referred to as Big Data. How to efficiently and effectively process such large amounts of data is an open research problem. Since communication is involved in Big Data processing at many levels, at the NaNoNetworking Center in Catalunya (N3Cat) we are currently investigating the potential role of wireless communications in the Big Data scenario. The main focus of the project is to evaluate the impact of applying wireless communications and networking methods to processors and data centers oriented to the management of Big Data.

OBJECTIVES
===========
N3Cat is looking for students wanting to work in the area of wireless communications for Big Data. To this end, the candidate will work on one of the following areas:
- Traffic analysis of Big Data frameworks and applications, as well as of smaller manycore systems.
- Channel characterization in Big Data environments: indoors, within the racks of a data center, within the package of a CPU, within a chip.
- Design of wireless communication protocols for computing systems, from the processor level to the data center level.
Robotic Process Automation is receiving significant attention, due to the promise of improving the performance of an organization's main processes by incorporating robots that partially perform repetitive tasks. In this project, we will consider how Process Mining can help in finding opportunities to apply Robotic Process Automation in a real case study.
Recently, one of the leaders in Robotic Process Automation acquired one of the main process mining tools (https://www.uipath.com/newsroom/uipath-acquires-process-gold-unparalleled-process-understanding). This confirms the potential link between the field of process mining and the field of robotic process automation.
In this project we will try to find out how strong this link is. Using real data from a company that is trying to automate its processes, the student will dig into the field of process mining to propose a methodology that unleashes the application of RPA.
For this project, a grant may be available to cover the time invested.
Web tracking technologies are extensively used to collect large amounts of personal information (PI), including the things we search for, the sites we visit, the people we contact, and the products we buy. Although it is commonly believed that this data is mainly used for targeted advertising, recent works have revealed that it is exploited for many other purposes, such as price discrimination, financial credibility, insurance coverage, government surveillance, background scanning, or identity theft. The main objective of this project is to apply network traffic monitoring and analysis technologies to uncover the particular methods used to track Internet users and collect PI. This project will be useful for both Internet users and the research community, and will produce open source tools, real datasets, and publications revealing the most privacy-threatening practices. Some preliminary results of our work in this area were recently published in Proceedings of the IEEE (IF: 9.237) and featured in a Wall Street Journal article.
More info at:
Much of the open-source code for big data is written for the JVM, and a large part of this code currently consists of data mining algorithms and other techniques that fall within the Artificial Intelligence specialization. Besides Java, Python, another interpreted language, is increasingly used in every programming environment, and in particular in artificial intelligence and machine learning. Each of the two languages has its adherents, based on its learning curve, its portability, and so on. Both are open source and portable, and support standard data mining tasks such as data pre-processing, classification, clustering, visualization, regression, and feature selection. The main objective of this project is: starting from a set of algorithms that characterize typical data mining work (data pre-processing, classification, clustering, visualization, regression, and feature selection), compare the performance of these two languages and assess the advantages and disadvantages of using one or the other depending on the underlying hardware platform; specifically, x86 and ARM.
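As a sketch of the kind of micro-benchmark the project would generalize (an equivalent Java version, both run on x86 and ARM hosts), here is a minimal pure-Python k-means timing harness; the dataset size, number of clusters, and iteration count are arbitrary choices:

```python
import random
import time

def kmeans(points, k=3, iters=10):
    """Naive k-means in pure Python, used here only as a benchmark workload."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centre (squared distance)
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2 +
                                  (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        new_centers = []
        for i, cl in enumerate(clusters):
            if cl:
                new_centers.append((sum(p[0] for p in cl) / len(cl),
                                    sum(p[1] for p in cl) / len(cl)))
            else:
                new_centers.append(centers[i])  # keep an empty cluster's centre
        centers = new_centers
    return centers

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(5000)]
t0 = time.perf_counter()
centers = kmeans(pts)
elapsed = time.perf_counter() - t0
print(f"pure-Python k-means on 5000 points: {elapsed:.3f}s")
```

The Java counterpart would time the same algorithm with `System.nanoTime()`, so the comparison isolates language runtime and hardware platform rather than algorithmic differences.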
MASTER THESIS (TFM) / RESEARCH CONTRACT: The main goal of the work will be to design and develop a prototype of a recommender system based on the use, exploitation and extension of the OntoCAPE ontology for a decision-making support tool in the design of alternative processes closing material loops in chemical supply chains.
Detailed Description: This work will be done within the current hot topic of fostering circular economy in the chemical process industries. Circular economy is based on managing waste through the 3Rs (reduce, recycle, reuse), transforming the traditional productive cycle (resource-product-waste) into a circular flow (resource-product-recycled resource), thus reducing resource use and waste, in line with the field of industrial ecology.
A first task will be to understand the OntoCAPE ontology, being able to manipulate it by extracting knowledge, using reasoning mechanisms, and, if possible, adding new information/knowledge to the ontology.
The second task will be to use the OntoCAPE ontology to help end-users solve the following scenario: given some waste products, the recommender system should be able to suggest good combinations of processes, which may include waste material separations (to recover useful materials) but also more complex chemical transformations of waste to generate other materials that can be used as resource materials in other processes. The number of possible combinations of processes and materials is very large, and it is very difficult to assess which are the best possible solutions regarding the minimization of economic costs and of the ecological footprint.
A third task will consist of the design and implementation of interfaces to other software systems, such as process simulation and mathematical optimization tools (e.g., ASPEN, GAMS). These systems are extensively used for the analysis of process systems and supply chains, with the aim of thoroughly evaluating and optimizing already identified transformation paths from economic, environmental, social, and technical-maturity points of view.
Advisor/s of the work: the work will be advised by Prof. Antonio Espuña (email@example.com) from Dept. of Chemical Engineering and CEPIMA research group (Centre for Process and Environment Engineering, https://cepima.upc.edu/en) and by Dr. Miquel Sànchez-Marrè (firstname.lastname@example.org), from Dept. of Computer Science and IDEAI (Intelligent Data Science and Artificial Intelligence, https://ideai.upc.edu/en) Research Centre.
Funding: the work will be funded for three months (March-May 2021), within the framework of a research project on Circular Economy in Process Engineering (CEPI), through a Research Support Personnel category position at UPC (PSR, "Personal de Suport a la Recerca"), starting as soon as possible.
Remuneration will be in accordance with the candidate's dedication load.
Contact: Antonio Espuña (email@example.com) and
Miquel Sànchez-Marrè (firstname.lastname@example.org).
Deadline: March 7, 2021
The Barcelona Neural Networking Center (BNN-UPC) is offering two positions to develop a Master Thesis in the field of Graph Neural Networks (GNN) applied to computer networking. This TFM will be fully funded and will be carried out in the context of a large industrial project with a major multinational technology company.
Graph Neural Networks (GNN) have recently been proposed to learn, model, and generalize over graph-structured data. Computer Networks are fundamentally graphs, and many of their relevant characteristics, such as topology and routing, are represented as graph-structured data.
GNNs are a central tool for applying ML techniques to Computer Networks. GNNs can learn the relationships between complex network characteristics and build relevant models that are useful to plan and manage a network. In combination with Deep Reinforcement Learning (DRL) techniques, GNNs can help develop autonomous network optimization mechanisms that will result in unprecedented performance, achieving the ultimate vision of self-driving networks.
The Barcelona Neural Networking Center (https://bnn.upc.edu) is a new research initiative of UPC with the main goal of carrying out fundamental research in the field of Graph Neural Networks applied to Computer Networks, and providing education and training to the new generation of Computer Networking students.
The main goal of this project is to develop a network monitoring system that can be used by network operators to detect bitcoin miners (or miners from other blockchain technologies) in their network. The system will rely only on network measurements obtained by standard network measurement tools, and will estimate interesting characteristics of the detected miners, such as power consumption. How to apply: Please send an email to with your CV and academic file (a PDF can be generated from the Raco).
This master's thesis aims to analyze the feasibility of a remote VR system based on the use of mobile devices with cardboard glasses and low-cost interaction devices. It will start from a system based on the HTC Vive and programmed with Unity. Different portability alternatives to the new platform will be analyzed, both in terms of the rendering of the models (locally or on a server) and the limitations of the interaction and of the connection between students and teacher. A prototype will be developed with basic interaction techniques, and its usability will be analyzed.
In many teaching applications, such as the study of anatomy, students are required to obtain a three-dimensional view of the models or structures to be analyzed. This capability is necessary for both individual study and class follow-up. There are some VR-based systems for this, usually based on immersive VR HMDs. However, these systems require an infrastructure and a cost that are often not affordable.
This master's thesis aims to analyze the feasibility of a remote VR system based on the use of mobile devices with cardboard glasses and low-cost interaction devices. It will start from a system based on the HTC Vive and programmed with Unity. Different portability alternatives to the new platform will be analyzed, both in terms of the rendering of the models (locally or on a server) and the limitations of the interaction and of the connection between students and teacher. Once the advantages and limitations have been analyzed, a prototype will be developed with basic interaction techniques, and its usability will be analyzed. The models to be visualized will be anatomical structures modeled with triangles. For more information, contact the directors.
UPC is offering a new position to develop a TFG/TFM in the field of Machine Learning and Cybersecurity. The thesis will be fully funded (internship) and carried out in collaboration with the Global Security Operations Center of Nestlé and UPC.
Cybersecurity is becoming an increasingly important challenge for all companies and individuals alike. While big names used to be the main targets in the past, as people's lives move online, anyone is nowadays a potential target for any kind of cyber-attack, ranging from phishing to ransomware or serious privacy issues. In order to fight against these ever-evolving threats, Machine Learning is increasingly being used behind the scenes to design better systems that are capable of self-learning, boosting detection rates and overall resilience to unknown attacks. As AI-based solutions penetrate products across the industry, a new kind of threat that is often overlooked is becoming more and more prominent and dangerous: adversarial machine learning (AML).
AML focuses on designing specific inputs that deceive a previously trained Machine Learning model into misclassifying them for a specific purpose. One of the main flaws of state-of-the-art Machine Learning and Deep Learning algorithms is that they assume the data they receive is systematically benign, which is generally the case but does not hold when an adversarial input is received. The motivation behind fooling an ML model into thinking that, for example, a new sample is benign when in fact it is malicious can range from pure research to serious real-life issues, such as an autonomous car wrongly classifying a stop sign (and thus provoking a fatal accident) or a disease being wrongly diagnosed because of a slightly manipulated magnetic resonance image.
Cybersecurity is no exception to this problem: companies wrongly assume that once the latest AI-based product is deployed in their network, their employees are safe...
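The core mechanic of AML can be illustrated on a toy linear "detector". This FGSM-style sketch (a gradient-sign perturbation) uses entirely made-up weights and features, purely to show how a small, bounded change to the input systematically shifts a model's score:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy linear detector: score(x) > 0 means the sample is flagged as malicious.
w = rng.normal(size=20)   # "trained" weights (random stand-ins)

def score(x):
    return float(w @ x)

# Craft a sample that the detector currently flags as malicious.
x = np.abs(rng.normal(size=20)) * np.sign(w)
assert score(x) > 0

# FGSM-style adversarial perturbation: for a linear model, the gradient of
# the score w.r.t. the input is simply w, so stepping against sign(w)
# lowers the score while changing each feature by at most eps.
eps = 0.6
x_adv = x - eps * np.sign(w)

print(score(x), score(x_adv))  # the perturbed sample scores strictly lower
```

For deep models the principle is identical, except the gradient is obtained by backpropagation instead of being the weight vector itself; defending against such perturbations is precisely the open problem motivating this thesis.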
The identification of the applications behind network traffic (i.e., traffic classification) is crucial for ISPs and network operators to better manage and control their networks. However, the increasing use of encryption and web-based applications makes this identification very challenging. This problem is exacerbated by the widespread deployment of content distribution networks (e.g., Akamai) and cloud-based services (e.g., Amazon AWS). The goal of this project is to develop a traffic monitoring tool to accurately identify web services in HTTPS traffic, including Google, YouTube, Facebook, and Twitter, among others. The tool will combine the information from IP addresses and DNS with novel classification methods inspired by the Google PageRank algorithm to identify encrypted traffic, even if served from Akamai, AWS, or Google infrastructures. This project will be carried out in collaboration with the tech-based company Talaia Networks (https://www.talaia.io), which develops cloud-based network monitoring solutions.
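For context on the PageRank-inspired idea, here is a plain power-iteration PageRank sketch over a hypothetical graph of domains observed together in HTTPS/DNS flows; the domain list and edges are invented for illustration, not real measurement data:

```python
import numpy as np

# Hypothetical co-occurrence graph: an edge i -> j means traffic to domain i
# was observed together with (or followed by) traffic to domain j.
domains = ["google.com", "youtube.com", "facebook.com", "akamai.net"]
links = {0: [1, 3], 1: [0, 3], 2: [3], 3: [0, 1, 2]}

def pagerank(links, n, d=0.85, iters=50):
    """Plain power-iteration PageRank over an adjacency dict."""
    r = np.ones(n) / n
    for _ in range(iters):
        new = np.full(n, (1 - d) / n)          # teleportation term
        for src, outs in links.items():
            for dst in outs:                   # spread src's rank over its out-links
                new[dst] += d * r[src] / len(outs)
        r = new
    return r

ranks = pagerank(links, len(domains))
print(sorted(zip(domains, ranks), key=lambda t: -t[1]))
```

In the project's setting, such propagation over a traffic-derived graph can help attribute flows served from shared infrastructure (Akamai, AWS) to the web service that actually generated them.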
How to apply: Please send an email to email@example.com with your CV and academic file (pdf can be generated from the Raco).