The SOCS Undergraduate Summer Research Symposium 2009 is an opportunity for undergraduate students working with SOCS professors during the summer of 2009 to showcase their research and compete for prizes, which will be decided by a jury of SOCS professors. All undergraduate summer students may participate, whether their research arrangement is formal (e.g., NSERC USRAs or COMP-396 students) or informal, and whether paid or unpaid.
Monday Aug. 31 - The day of the symposium, in Trottier 0100
State-Space Control and Python Bindings for the Amphibious AQUA Robot
The project involved the development of a new user interface for the
control of robotic systems, and the supporting software and networking
infrastructure to evaluate the approach. The project examined a novel
approach to trajectory specification that allows a user to easily
express what would happen at specific locations. The implementation of
this idea also includes a general-purpose scripting-based software
architecture for
robot control. This project developed a Python language interface
(front end) for the AQUA robot family (see http://www.aquarobot.net)
that included application-specific code modules and a C-to-Python
interface. Prior to this project, these robots were driven using a
complex C++ API. Much of this API was simplified and abstracted in
order to provide an easy-to-learn interface to the robot. This project
involved developing, implementing, and testing the Python interface, a
graphical user interface, and a network-based controller for the robot.
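The simplification idea described above can be sketched as a thin Python facade that translates named, high-level calls into low-level commands. The class, method names, and command strings below are hypothetical illustrations, not the actual AQUA API.

```python
class AquaRobot:
    """Hypothetical high-level facade over a low-level command channel.

    The real AQUA C++ API differs; `transport` stands in for the
    networking layer that carries one raw command string to the robot.
    """

    def __init__(self, transport):
        self.transport = transport

    def set_pose(self, roll, pitch, yaw):
        # hide the low-level message format behind a named method
        self.transport(f"POSE {roll:.2f} {pitch:.2f} {yaw:.2f}")

    def swim(self, speed, duration_s):
        self.transport(f"SWIM {speed:.2f} {duration_s:.1f}")


# record commands instead of sending them over the network
sent = []
robot = AquaRobot(sent.append)
robot.set_pose(0, -10, 90)
robot.swim(0.5, 3)
```

Swapping the `transport` callable lets the same script drive a simulator, a log recorder, or the real network controller.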
Variable selection with missing data due to dropout
Seizure Detection using Behavior Analysis
Application of Gaussian Processes on epileptic EEG
Determining the role of the human protein-protein interaction network in breast cancer
Cloning genes in the naked mole-rat without a reference genome
Multi-Model Exploration
Designing Efficient Markets
Friction Simulation Using Gauss-Seidel-like Algorithm for Animation
Proximity and interpenetration depth queries for deformable models
Efficient stiffness computation for hexahedral finite elements
C-FEM: Composite Finite Elements on the iPhone
Perception, Memory, Imagination
Calculus, Combinatorics and (Quantum) Computation
Schizophrenia is a rare and serious disease for which no global and effective
treatment is known. A recent multiphase study was carried out that treated
patients in a semi-random fashion and recorded their progress. A high
dropout rate needs to be dealt with. Through data mining, we hope to be able to
understand which of the patients' characteristics explain their progress (or lack
thereof). Of particular interest are the variables that interact with treatment
to explain a patient's progress.
To this end, a greedy search that compares candidate models by their Bayesian
Information Criterion will be used. Because of its monotone pattern, the
missing data can be imputed using Fully Conditional Specification. We explore two
methods that take into consideration multiple imputations in order to carry out
the final model selection.
Some preliminary results on simulated data (with added error and
missingness) are shown.
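The greedy BIC-based search described above can be sketched on complete data (the imputation step is omitted here); forward selection from an intercept-only model is an assumption about the search direction, not a detail from the abstract.

```python
import numpy as np

def bic(X, y):
    """BIC of an OLS fit: n*log(RSS/n) + k*log(n)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

def greedy_select(X, y):
    """Forward selection: add the variable that lowers BIC most,
    stop when no addition improves it."""
    n, p = X.shape
    selected, current = [], np.ones((n, 1))  # start from intercept-only
    best = bic(current, y)
    while True:
        scores = {j: bic(np.column_stack([current, X[:, j]]), y)
                  for j in range(p) if j not in selected}
        if not scores:
            return selected
        j_best = min(scores, key=scores.get)
        if scores[j_best] >= best:
            return selected
        best = scores[j_best]
        selected.append(j_best)
        current = np.column_stack([current, X[:, j_best]])
```

On data where only two of five candidate variables drive the response, the search recovers them first and the BIC penalty stops it from adding noise variables.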
Epilepsy is a chronic neurological disorder affecting 1% of the world
population. Its major symptoms are seizures, characterized by sudden
bursts of synchronous activity in the brain. Most of the time the cause
is not known, making the development of treatments difficult.
Researchers have been interested in electroencephalograms (EEG) because
of the information they carry about brain activity, but the complexity
and the amount of noise in the signal make it hard to interpret. To
analyze the recordings, features are extracted from the data and used
to determine whether the patient is in an inter-ictal, pre-ictal,
ictal, or post-ictal phase. The method introduced here measures changes
in the behavior of the features as a way to detect the occurrence of
seizures, instead of using the features as is.
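The behavior-change idea can be sketched with one illustrative feature and a threshold rule; the line-length feature and the z-score test below are common textbook choices, not necessarily the abstract's actual method.

```python
import numpy as np

def line_length(signal, win):
    """One feature value per non-overlapping window:
    total absolute variation (a classic EEG feature)."""
    return np.array([np.sum(np.abs(np.diff(signal[i:i + win])))
                     for i in range(0, len(signal) - win + 1, win)])

def behavior_change(feats, n_baseline, thresh=3.0):
    """Flag windows whose feature deviates from the behavior of the
    baseline windows, rather than thresholding the feature directly."""
    mu, sd = feats[:n_baseline].mean(), feats[:n_baseline].std()
    return np.where(np.abs(feats - mu) / sd > thresh)[0]
```

On a synthetic trace that switches from low-amplitude noise to a strong oscillation, the flagged windows line up with the onset of the oscillation.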
I present an application of Gaussian Processes to detect epileptiform
activity from electrophysiological recordings. In practice, systems
used for this kind of detection are likely to be limited in resources.
I also show a method that uses Gaussian Processes to discover the key
characteristics of these activities in order to produce a compact state
representation of epileptic events.
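A minimal GP regression in plain NumPy shows the ingredients such a detector needs: a kernel, a noise level, and a predictive mean with uncertainty. The RBF kernel and hyperparameters here are illustrative choices, not the abstract's configuration.

```python
import numpy as np

def rbf(a, b, length_scale=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise_var=1e-2):
    """Posterior mean and per-point std. dev. of a zero-mean GP."""
    K = rbf(x_train, x_train) + noise_var * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = rbf(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))
```

Samples falling many predictive standard deviations from the mean of a GP fitted to baseline activity could then be flagged as candidate epileptiform events.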
Recent work has shown that biomarkers differentiating breast cancer
prognosis derived from the protein-protein interaction (PPI) network
are more reproducible and accurate across datasets than conventional
predictors. However, it is not clear whether these findings are
specific to the human PPI network itself. Here we examine gene sets on
the human PPI network and their relevance to breast cancer outcome. The
analysis of these gene sets on two cohorts of breast cancer patients
showed an enrichment of cancer-relevant gene sets on the network
compared to random. Further analysis revealed that this enrichment is
due to the specific network location of a small number of genes.
Furthermore, gene sets capable of identifying subtypes, and prognosis
within subtypes, show a similar result.
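The "compared to random" comparison is a permutation test in spirit: score same-size random gene sets and see how often they match the observed set. The scoring function below is a placeholder for whatever network-based score is used.

```python
import random

def enrichment_p(score_fn, gene_set, all_genes, n_perm=1000, seed=0):
    """Empirical p-value: fraction of same-size random gene sets
    scoring at least as high as the observed set."""
    rng = random.Random(seed)
    observed = score_fn(gene_set)
    hits = sum(score_fn(rng.sample(all_genes, len(gene_set))) >= observed
               for _ in range(n_perm))
    return hits / n_perm
```

A small p-value indicates the observed gene set scores higher on the network than size-matched random sets typically do.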
Naked mole-rats can live over 28 years, significantly longer than
similar-sized rodents like mice that do not commonly live more than 4
years. Because of these marked differences in longevity and
age-related phenotypes between such similar species, genomic
comparisons between them may provide clues about the genetic and
molecular basis of species differences in aging. Therefore, the main
goal of this project is the identification and characterization of
naked mole-rat genes that contributed to the evolution of a long
lifespan in this species.
I will discuss some of the computational work done in the Church Lab
at Harvard Medical School in collaboration with Dr. Yu Chuan Fei
(Church Lab) and Prof. Mathieu Blanchette. A rushed, but necessary,
introduction to molecular biology will be given.
In many applications of artificial intelligence, agents are asked to
explore complex worlds. These agents usually rely on observations of
the world in order to navigate through it. However, relying on these
observations raises some issues: two distinct states may produce
identical observations. This aliasing problem hinders efficient
exploration of the world. An exploration method that takes this problem
into account will be presented.
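One simple way to soften this aliasing problem, shown only as background and not necessarily the talk's method, is to let the last k observations play the role of the state.

```python
def history_states(observations, k=2):
    """Map each raw observation to a state keyed by the last k
    observations, so visually identical places can be told apart
    by how the agent arrived there."""
    return [tuple(observations[max(0, i - k + 1):i + 1])
            for i in range(len(observations))]
```

Two visits that look identical in isolation receive different history states whenever their recent past differs.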
An important problem in the social sciences has been to find a socially
optimal way of sharing scarce goods. In the last decade, computer
scientists started their own research into this century-old problem,
motivated by the question of how to efficiently share bandwidth on a
network like the Internet. Current results hold only when consumers are
competing for a fixed supply, such as a fixed network capacity. In my
talk, I will present a new type of market that extends these results to
the case when both consumers and producers are competing over how
bandwidth will be offered and shared on a network. I will prove that
the social welfare in this market is always within about 40% of
optimal, and that this is the best that can be done under some mild
assumptions.
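For background, the classic fixed-supply setting that such results build on is the proportional-share (Kelly) mechanism: each consumer submits a bid and receives bandwidth in proportion to it. This is the prior model, not the new producer-consumer market in the talk.

```python
def proportional_share(bids, capacity):
    """Kelly mechanism for a fixed supply:
    allocation_i = capacity * bid_i / total_bids."""
    total = sum(bids.values())
    return {user: capacity * bid / total for user, bid in bids.items()}
```

With bids of 1 and 3 on a link of capacity 8, the users receive 2 and 6 units respectively.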
Physics-based modeling is appealing for animation as it allows us to
build objects that follow the laws of physics and react as they would
in the real world. However, while the basic principles and equations
have been known since the 17th century, making use of them on a digital
computer introduces some difficulties. Friction, while well understood
in physics, introduces constraints which can make otherwise simple
linear systems very difficult to solve. The use of a Gauss-Seidel-like
algorithm can allow us to obtain a solution, or an arbitrarily close
approximation of one, relatively easily.
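A Gauss-Seidel-like sweep of this kind can be sketched for a box-constrained linear system, a common formulation of frictional contact; the bounds lo/hi stand in for friction-cone limits. This is a generic projected Gauss-Seidel sketch, not the specific algorithm of the project.

```python
import numpy as np

def projected_gauss_seidel(A, b, lo, hi, iters=500):
    """Iteratively solve A x = -b, clamping each unknown to
    [lo[i], hi[i]] after every Gauss-Seidel update
    (a simple box-constrained solver)."""
    x = np.zeros(len(b))
    for _ in range(iters):
        for i in range(len(b)):
            # residual of row i, excluding the diagonal term
            r = b[i] + A[i] @ x - A[i, i] * x[i]
            x[i] = np.clip(-r / A[i, i], lo[i], hi[i])
    return x
```

With infinite bounds the sweep converges to the ordinary linear solution; with finite bounds it returns a clamped approximation instead of failing.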
The goal of this project is to develop an algorithm that can provide
approximate computations of both proximity and interpenetration depth
for deformable models. Proximity queries are important for anticipating
contact constraints in a physical simulation, while interpenetration
depth combined with penalty methods is one of the easiest ways to deal
with contact. Adaptively sampled distance fields are a convenient data
structure for both queries, but they must be recomputed if geometry
changes shape. Bounded deformation sphere trees are a convenient data
structure for proximity queries between low-rank deformable models, but
do not provide approximations of interpenetration depth. Other work has
proposed that the interior distance field be deformed using
interpolation on the mechanical mesh into which the deformable geometry
is embedded, but this assumes a physically based deformation. As such,
this project will investigate the extension of adaptively sampled
distance fields to handle both inside and outside distance queries with
arbitrary deformable geometry, and also the extension of adaptively
sampled gradient fields to model invertible space-warping functions and
parametric deformations. Results will be evaluated in applications that
involve resolution of contact in models undergoing small deformation,
for instance finger pads that deform when grasping an object, or an
upper and lower lip that make contact during different facial
expressions.
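The basic operation behind any sampled distance field is an interpolated lookup. This sketch samples the exact signed distance of a sphere on a regular grid (not the adaptive octree the project uses) and answers inside/outside queries by trilinear interpolation.

```python
import numpy as np

def sphere_sdf_grid(radius, extent, res):
    """Sample the signed distance of a sphere at the origin
    on a res^3 regular grid spanning [-extent, extent]^3."""
    xs = np.linspace(-extent, extent, res)
    X, Y, Z = np.meshgrid(xs, xs, xs, indexing="ij")
    return np.sqrt(X**2 + Y**2 + Z**2) - radius, xs

def trilinear(field, xs, p):
    """Trilinearly interpolate the grid-sampled field at point p."""
    h = xs[1] - xs[0]
    f = (np.asarray(p, dtype=float) - xs[0]) / h
    i = np.clip(f.astype(int), 0, len(xs) - 2)  # cell index per axis
    t = f - i                                   # fractional position
    c = field[i[0]:i[0] + 2, i[1]:i[1] + 2, i[2]:i[2] + 2]
    for axis in range(3):                       # collapse one axis at a time
        c = c[0] * (1 - t[axis]) + c[1] * t[axis]
    return float(c)

field, xs = sphere_sdf_grid(1.0, 2.0, 64)
```

Negative values give an interpenetration depth, positive values a separation distance, from the same structure.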
Simulation of deformable objects is a critical problem in computer
animation. The Cauchy strain tensor is often used in a local
(corotational) reference frame to simplify force and stiffness
computations. Unfortunately, this stiffness warping can lead to
instability and incorrect forces which produce visible artifacts in a
simulation; the Green-Lagrange strain tensor is preferable but more
computationally expensive since it results in quadratic and cubic
polynomials in 24 variables. This project focuses on methods for
efficiently computing the Green-Lagrange forces and stiffness of
hexahedral elements.
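The rotation artifact motivating this project is easy to demonstrate: the linear Cauchy strain reports spurious strain under a pure rigid rotation, while the Green-Lagrange tensor E = 1/2 (F^T F - I) does not. F below is a deformation gradient.

```python
import numpy as np

def cauchy_strain(F):
    """Linear (small-displacement) strain: 0.5*((F - I) + (F - I)^T).
    Not invariant under rigid rotations."""
    d = F - np.eye(3)
    return 0.5 * (d + d.T)

def green_lagrange_strain(F):
    """E = 0.5*(F^T F - I); exactly zero for any rigid rotation,
    at the cost of being quadratic in F."""
    return 0.5 * (F.T @ F - np.eye(3))
```

Evaluating both on a rotation matrix shows zero Green-Lagrange strain but nonzero Cauchy strain, which is what produces visible artifacts without corotational fixes.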
A foray into 3D modelling on the iPhone. This project is a ground-up
exploration of 3D computer animation on the iPhone and iPod touch. As
such, this talk will touch broadly on a number of topics such as the
general techniques involved in computer animation, and the iPhone
itself as a platform. This will lead into a more in-depth discussion of
the specific techniques used: for example, linear interpolation and its
applications for modeling deformable objects, finite element methods
and how they extend to composite finite element methods, and the
advantages and disadvantages of modeling deformable objects this way.
Finally, I will discuss the various pitfalls encountered and some
general optimizations we incorporated to bypass them.
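The interpolation-based modeling mentioned above can be sketched for a single tetrahedral element: a surface vertex stores barycentric coordinates once, then simply interpolates wherever the element's nodes move. This single-tet sketch is an illustration; real embedded meshes do this per element.

```python
import numpy as np

def barycentric(p, nodes):
    """Barycentric coordinates of p w.r.t. a tetrahedron
    (nodes: 4x3 array); the extra row enforces sum(lam) == 1."""
    A = np.vstack([nodes.T, np.ones(4)])
    return np.linalg.solve(A, np.append(p, 1.0))

def deformed_position(lam, deformed_nodes):
    """The embedded vertex just interpolates the moved nodes."""
    return deformed_nodes.T @ lam
```

Because the coordinates sum to one, a rigid translation of the element carries the embedded vertex along exactly.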
An agent is designed to explore a simple virtual environment. In the
first phase of the project, a tree of perceptual schemas is built up
from given primitives to multiple levels of specificity of geometry
and colour. In the second phase, the agent is endowed with a working
memory, which it uses to compile a long-term memory of its
explorations. The agent develops a sense of the environment as space
(in contrast to its previous disconnected perceptions), and a
bottom-up predictive map of the space is created. In the third phase
of the project, the agent extrapolates from what it knows, answering
more complex questions about its environment.
To the extent its responses are correct, it is successful, but where
it is incorrect (where the agent is pushed just past the limits of
its knowledge), it nonetheless suggests the idea of an imaginative
computational agent, whose "creative misunderstandings" offer
alternative points of view that may result in novel solutions to hard
problems.
The field of quantum computation combines the efforts of both computer science and
physics. The mathematics involved contains analysis and algebra; calculus and
combinatorics. I will motivate and introduce a recent theoretical tool: unitary
t-designs. Inspired by spherical designs, these combinatorial designs allow us to
replace integrals over polynomials in U(d) with finite sums. In particular, I will
discuss symmetries and how they allow us to expand t-designs. I will also present
lower bounds on the size of unitary t-designs. A (very) brief introduction to
quantum mechanics will be given.
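For reference, the defining property alluded to above, replacing Haar integrals with finite sums, is standardly stated as follows:

```latex
% A finite set X \subset U(d) is a unitary t-design if, for every
% linear operator M on (\mathbb{C}^d)^{\otimes t},
\frac{1}{|X|} \sum_{U \in X} U^{\otimes t}\, M \left(U^{\dagger}\right)^{\otimes t}
  \;=\; \int_{U(d)} U^{\otimes t}\, M \left(U^{\dagger}\right)^{\otimes t} \, dU,
% where dU is the normalized Haar measure on U(d). Averages of
% polynomials of degree at most t in the entries of U (and their
% conjugates) over X thus equal their Haar averages.
```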