Department of Computer Science and Automation, Indian Institute of Science, Bangalore, India


COURSE DESCRIPTIONS

Motivation and objectives of the course:

The design and implementation of scalable, reliable and secure software systems is critical for many modern applications. Numerous program analyses have been designed to aid the programmer in building such systems, and significant advances have been made in recent years. The course introduces the practical issues associated with programming for modern applications, the algorithms underlying these analyses, and the applicability of these approaches to large systems. There will be special emphasis on practical issues found in modern software. The course project will be geared towards building the programming skills required for implementing large software systems.

Syllabus:

The course will introduce the students to the following topics -- bytecode instrumentation; profiling -- Ball-Larus (BL) path profiling, profiling in the presence of loops, preferential path profiling, memory profiling; software bloat; lock-free data structures; memoization; map-reduce programming model; approximate computing; multithreading; fuzzing techniques; record and replay; memory models; data races -- lockset algorithm, happens-before relation, causally-precedes relation; atomicity violations; deadlocks; linearizability; symbolic execution; concolic testing; directed program synthesis; constraint solving; deterministic/stable multithreaded systems; floating-point problems; security -- SQL injection, cross-site scripting, return-oriented programming, obfuscation; malware detection.
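
As a taste of one of the topics above, here is a minimal sketch of the lockset idea behind Eraser-style race detection: each shared variable's candidate lockset is intersected with the set of locks held at every access, and an empty lockset flags a potential race. This is an illustrative Python sketch, not course material; the trace format and names are invented.

    # Minimal lockset sketch (Eraser-style), for illustration only.
    # Events are (thread, op, var, locks_held) tuples; all names are made up.

    def lockset_races(trace):
        candidate = {}          # var -> locks that protected every access so far
        races = set()
        for thread, op, var, locks_held in trace:
            held = set(locks_held)
            if var not in candidate:
                candidate[var] = held            # first access initialises the lockset
            else:
                candidate[var] &= held           # keep only locks held on every access
            if not candidate[var]:
                races.add(var)                   # empty lockset => potential data race
        return races

    trace = [
        ("T1", "write", "x", {"L1"}),
        ("T2", "write", "x", {"L1"}),            # consistently protected by L1
        ("T1", "write", "y", {"L1"}),
        ("T2", "write", "y", {"L2"}),            # no common lock => flagged
    ]
    print(lockset_races(trace))                  # {'y'}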

References:

  • Course material available from the webpage; research papers

Prerequisites

  • Basic knowledge of programming in C/C++/Java.


Vector Spaces : Subspaces, Linear independence, Basis and dimension, orthogonality. Matrices : Solutions of linear equations, Gaussian elimination, Determinants, Eigenvalues and Eigenvectors, Characteristic polynomial, Minimal polynomial, Positive definite matrices and Canonical forms. Singular Value Decomposition, Applications.

References:

  • G Strang, Linear Algebra and Applications, Thomson-Brooks/Cole, 4th edition, 2006.


Vertex cover, matching, path cover, connectivity, hamiltonicity, edge colouring, vertex colouring, list colouring; Planarity, Perfect graphs; other special classes of graphs; Random graphs, Network flows, Introduction to Graph minor theory

References:

  • Reinhard Diestel, "Graph Theory", Springer (2010)
  • Douglas B. West, "Introduction to Graph Theory", Prentice Hall (2001)
  • A. Bondy and U. S. R. Murty, "Graph Theory", Springer (2008)
  • B. Bollobás, "Modern Graph Theory", Springer (1998)


Basic Mathematical Notions: Logic, Sets, Relations, Functions, Proofs; Abstract Orders: Partial Orders, Lattices, Boolean Algebra, Well Orders; Counting & Combinatorics: Pigeonhole Principle, The Principle of Inclusion and Exclusion, Recurrence Relations, Permutations and Combinations, Binomial Coefficients and Identities; Number Theory: Mathematical Induction, Divisibility, The Greatest Common Divisor, The Euclidean Algorithm, Prime Numbers, Integers, Fundamental Theorem of Arithmetic, Modular Arithmetic, Arithmetic with a Prime Modulus, Arithmetic with an Arbitrary Modulus, The RSA Algorithm; Groups and Fields: Basics, Isomorphism theorems, Chinese Remainder Theorem, Finite Fields; Graph Theory: Graph Terminology and Special Types of Graphs, Bipartite Graphs and Matching, Representation of Graphs, Connectivity, Euler and Hamilton Paths and Cycles, Planar Graphs, Graph Coloring, Trees.
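
Since the Euclidean algorithm, modular arithmetic and RSA appear above, a small illustrative Python sketch (not part of the official course material; the tiny primes are the standard textbook toy example and are far too small to be secure):

    # Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y == g == gcd(a, b).
    def extended_gcd(a, b):
        if b == 0:
            return a, 1, 0
        g, x, y = extended_gcd(b, a % b)
        return g, y, x - (a // b) * y

    def mod_inverse(a, m):
        g, x, _ = extended_gcd(a, m)
        if g != 1:
            raise ValueError("inverse exists only when gcd(a, m) == 1")
        return x % m

    # Toy RSA-style check with tiny (insecure) primes p=61, q=53.
    p, q = 61, 53
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17
    d = mod_inverse(e, phi)          # private exponent
    m = 65
    c = pow(m, e, n)                 # encrypt
    assert pow(c, d, n) == m         # decrypt recovers the message
    print(n, d, c)                   # 3233 2753 2790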

References:

  • Laszlo Lovasz, Jozsef Pelikan, Katalin L. Vesztergombi: Discrete Mathematics, Springer 2003.
  • Graham, R.L., Knuth, D.E., and Patashnik, O.: Concrete Mathematics: A Foundation for Computer Science, Addison-Wesley Professional, 2nd edition, 1994.
  • Herstein, I.N.: Topics in Algebra, 2nd edition, Wiley India, 1975.


Finite-state automata, including the Myhill-Nerode theorem, ultimate periodicity, and Büchi's logical characterization. Pushdown automata and context-free languages, including deterministic PDAs, Parikh's theorem, and the Chomsky-Schützenberger theorem. Turing machines and undecidability, including Rice's theorem and Gödel's incompleteness theorem.

References:

  • Hopcroft J.E. and Ullman J.D.: Introduction to Automata Theory, Languages, and Computation. Addison Wesley, 1979.
  • Dexter Kozen: Automata and Computability. Springer 1999.
  • Wolfgang Thomas: Automata on infinite objects, in Handbook of Theoretical Computer Science, Volume B, Elsevier, 1990.


(1) Formal models of systems: labelled state transition diagrams for concurrent processes and protocols, timed and hybrid automata for embedded and real-time systems. (2) Specification logics: propositional and first-order logic; temporal logics (CTL, LTL, CTL*); fixpoint logic: mu-calculus. (3) Algorithmic analysis: model checking, data structures and algorithms for symbolic model checking, decision procedures for satisfiability and satisfiability modulo theories. (4) Tools: Student projects and assignments involving model checking and satisfiability tools e.g. zChaff, SPIN, NuSMV, Uppaal.

References:

  • Michael Huth, Mark Ryan: Logic in Computer Science: Modelling and Reasoning about Systems, Cambridge University Press, 2004.
  • Edmund M. Clarke, Orna Grumberg, Doron Peled: Model Checking, MIT Press, 2001.
  • Daniel Kroening, Ofer Strichman: Decision Procedures: An Algorithmic Point of View, Springer, 2008.

Prerequisites

  • Basics of data structures, algorithms, and automata theory.


Computational complexity theory is the fundamental subject of classifying computational problems based on their 'complexity'. In this context, the 'complexity' of a problem is a measure of the amount of resources (time, space, random bits, or queries) used by the best possible algorithm that solves the problem. The aim of this course is to give a basic introduction to this field. Starting with the basic definitions and properties, we intend to cover some of the classical results and proof techniques of complexity theory.

Introduction to basic complexity classes; notion of `reductions' and `completeness'; time hierarchy theorem & Ladner's theorem; space bounded computation; polynomial time hierarchy; Boolean circuit complexity; complexity of randomized computation; interactive proofs; complexity of counting.

References:

  • Sanjeev Arora and Boaz Barak, Computational Complexity: A Modern Approach, Cambridge University Press, 2009.
  • Lecture notes of similar courses as and when required.

Prerequisites

  • Basic familiarity with undergraduate level theory of computation and data structures & algorithms would be helpful.
  • More importantly, some mathematical maturity with an inclination towards theoretical computer science.


Review of basic data structures, searching, sorting. Algorithmic paradigms, e.g., greedy algorithms, divide and conquer strategies, dynamic programming. Advanced data structures. Graph algorithms. Geometric algorithms, Randomized algorithms. NP and NP-completeness.

References:

  • Jon Kleinberg and Éva Tardos, Algorithm Design, Addison Wesley, 2005.
  • Cormen, T.H., Leiserson, C.E., Rivest, R.L. and Stein C, Introduction to Algorithms, 2nd Edition, Prentice Hall, 2001.
  • Aho, A.V., Hopcroft, J.E., and Ullman, J.D., The Design and Analysis of Computer Algorithms, Addison-Wesley, 1974.


Linear Algebra: System of Linear Equations, Vector Spaces, Linear Transformations, Matrices, Polynomials, Determinants, Elementary Canonical Forms, Inner Product Spaces, Orthogonality. Probability: Probability Spaces, Random Variables, Expectation and Moment generating functions, Inequalities, Some Special Distributions. Limits of sequence of random variables, Introduction to Statistics, Hypothesis testing.

References:

  • Gilbert Strang, Linear Algebra and its Applications, Thomson-Brooks/ Cole, 4th edition, 2006.
  • Hoffman and Kunze, Linear Algebra, Prentice Hall, 2nd edition, 1971.
  • Kishor S. Trivedi, Probability and Statistics with Reliability, Queueing, and Computer Science Applications, Wiley, 2nd edition, 2008.
  • Vijay K. Rohatgi, A. K. Md. Ehsanes Saleh, An Introduction to Probability and Statistics, Wiley, 2nd edition, 2000.
  • Kai Lai Chung, Farid Aitsahlia, Elementary Probability Theory, Springer, 4th edition, 2006.


Semantics of programs: denotational semantics, operational semantics, Hoare logic. Dataflow analysis: Computing join-over-all-paths information as the least solution to a set of equations that model the program statements, analysis of multi-procedure programs. Abstract interpretation of programs: Correctness of abstract interpretation, Galois connections, dataflow analysis as an abstract interpretation. Type inference: Hindley-Milner's type inference algorithm for functional programs, subset-based and unification-based type inference for imperative programs. Pointer analysis.
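
To make the "least solution to a set of equations" view concrete, here is a minimal round-robin live-variable analysis over an invented three-node CFG; this is an illustrative sketch only, and the node names, use/def sets and successor map are assumptions, not course material.

    # Live-variable analysis as an iterative fixpoint over dataflow equations:
    #   in[n]  = use[n] | (out[n] - def[n])
    #   out[n] = union of in[s] over successors s of n

    cfg_succ = {"n1": ["n2"], "n2": ["n3"], "n3": []}
    use  = {"n1": {"a"}, "n2": {"b"}, "n3": {"c"}}
    defs = {"n1": {"b"}, "n2": {"c"}, "n3": set()}

    live_in  = {n: set() for n in cfg_succ}
    live_out = {n: set() for n in cfg_succ}

    changed = True
    while changed:                       # iterate until the least fixpoint is reached
        changed = False
        for n in cfg_succ:
            new_out = set().union(*[live_in[s] for s in cfg_succ[n]])
            new_in = use[n] | (new_out - defs[n])
            if new_out != live_out[n] or new_in != live_in[n]:
                live_out[n], live_in[n] = new_out, new_in
                changed = True

    print(live_in)   # {'n1': {'a'}, 'n2': {'b'}, 'n3': {'c'}}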

References:

  • Flemming Nielson, Hanne Riis Nielson, and Chris Hankin: Principles of Program Analysis, Springer, (Corrected 2nd printing, 452 pages, ISBN 3-540-65410-0), 2005.
  • Benjamin Pierce: Types and Programming Languages, Prentice-Hall India, 2002.
  • Research papers


Basic combinatorial numbers, selection with repetition, pigeonhole principle, Inclusion-Exclusion Principle, Double counting; Recurrence Relations, Generating functions; Special combinatorial numbers: Stirling numbers of the first and second kind, Catalan numbers, Partition numbers; Introduction to Ramsey theory; Combinatorial designs, Latin squares; Introduction to Probabilistic methods, Introduction to Linear algebra methods.

References:

  • R. P. Grimaldi, B. V. Ramana, "Discrete and Combinatorial Mathematics: An applied introduction", Pearson Education (2007)
  • Richard A Brualdi, "Introductory Combinatorics", Pearson Education, Inc. (2004)
  • Miklos Bona, "Introduction to Enumerative Combinatorics", Mc Graw Hill (2007)
  • Miklos Bona, "A walk through Combinatorics: An introduction to enumeration and graph theory", World Scientific Publishing Co. Pvt. Ltd. (2006)
  • J. H. van Lint, R. M. Wilson, "A Course in Combinatorics", Cambridge University Press (1992, 2001)
  • Stasys Jukna, "Extremal Combinatorics: With applications in computer science", Springer-Verlag (2001)
  • Noga Alon, Joel H. Spencer, P. Erdos, "The Probabilistic Method", Wiley Interscience Publication
  • Laszlo Babai and Peter Frankl, "Linear Algebra Methods in Combinatorics, with Applications to Geometry and Computer Science" (Unpublished Manuscript, 1992)

Prerequisites

  • None. (A very basic familiarity with probability theory and linear algebra is preferred, but not a must. The required concepts will be introduced quickly in the course.)


High Dimensional Geometry, SVD and applications, Random Graphs, Markov Chains, Algorithms in Machine Learning, Clustering, Massive data and Sampling on the fly

References:

  • Foundations of Data Science by Hopcroft and Kannan

Prerequisites

  • Basic Linear Algebra, Basic Probability.


Need for unconstrained methods in solving constrained problems. Necessary conditions of unconstrained optimization, Structure of methods, quadratic models. Methods of line search, Armijo-Goldstein and Wolfe conditions for partial line search. Global convergence theorem, Steepest descent method. Quasi-Newton methods: DFP, BFGS, Broyden family. Conjugate-direction methods: Fletcher-Reeves, Polak-Ribière. Derivative-free methods: finite differencing. Restricted step methods. Methods for sums of squares and nonlinear equations. Linear and Quadratic Programming. Duality in optimization.
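
A minimal Python sketch of steepest descent with an Armijo backtracking line search, for illustration only; the quadratic test function, starting point and constants are arbitrary assumptions and not drawn from the course.

    import numpy as np

    def f(x):            # simple convex quadratic used only as a test function
        return 0.5 * x @ x + x[0]

    def grad_f(x):
        g = x.copy()
        g[0] += 1.0
        return g

    def armijo_step(x, d, c1=1e-4, beta=0.5, t=1.0):
        """Backtrack until f(x + t*d) <= f(x) + c1 * t * grad_f(x).d (Armijo condition)."""
        fx, gx = f(x), grad_f(x)
        while f(x + t * d) > fx + c1 * t * (gx @ d):
            t *= beta
        return t

    x = np.array([5.0, -3.0])
    for _ in range(50):                  # steepest descent iterations
        d = -grad_f(x)                   # descent direction
        if np.linalg.norm(d) < 1e-8:
            break
        x = x + armijo_step(x, d) * d

    print(x)                             # close to the minimiser [-1, 0]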

References:

  • Fletcher R., Practical Methods of Optimization, John Wiley, 2000.


Basic algebraic notions: Integers, Euclidean algorithm, division algorithm, rings and polynomial rings, abstract orders and Dickson's lemma; Introduction to Gröbner bases: Term orders, multivariate division algorithm, Hilbert basis theorem, Gröbner bases and the Buchberger algorithm, computation of syzygies, basic algorithms in ideal theory, universal Gröbner bases; Algebraic Applications: Hilbert Nullstellensatz, implicitization, decomposition, radicals and zeros of ideals; Other applications: Toric ideals and integer programming, applications to graph theory, coding, cryptography, statistics.

References:

  • Ideals, Varieties, and Algorithms by D. Cox, J. Little, and D. O'Shea, Springer, 2nd ed., 1997.
  • Algorithmic Algebra by Bhubaneswar Mishra, Springer, 1993.


Probability spaces and continuity of probability measures, random variables and expectation, moment inequalities, multivariate random variables, sequence of random variables and different modes of convergence, law of large numbers, Markov chains, statistical hypothesis testing, exponential models, introduction to large deviations.

References:

  • An Introduction to Probability and Statistics by Vijay K. Rohatgi, A. K. Md. Ehsanes Saleh, Wiley, 2nd edition 2000.
  • An Intermediate Course in Probability, by Allan Gut, Springer, 2008.


Data compression and Kraft's inequality, source coding theorem and Shannon entropy, Kullback-Leibler divergence and maximum entropy, I-projections and Sanov's theorem, Kullback-Csiszár iteration and iterative scaling algorithms, Fisher information and the Cramer-Rao inequality, quantization and introduction to rate distortion theory, generalized information measures and power-law distributions.
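
As a small worked illustration of two quantities listed above, Shannon entropy and Kullback-Leibler divergence for discrete distributions; the distributions below are arbitrary examples, not course data.

    import math

    def entropy(p):
        """Shannon entropy H(p) in bits for a discrete distribution p."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def kl_divergence(p, q):
        """D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.5, 0.25, 0.25]
    q = [1/3, 1/3, 1/3]
    print(entropy(p))            # 1.5 bits
    print(kl_divergence(p, q))   # ~0.085 bits, and always >= 0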

References:

  • Elements of Information Theory, by T. M. Cover and J. A. Thomas, John Wiley & Sons, 2nd edition, 2006.
  • Information Theory, Inference, and Learning Algorithms by D.J.C. MacKay, Cambridge University Press, 2003.


The use of randomness in algorithm design is an extremely powerful paradigm. Often, it makes algorithm design (and analysis) easier; however, there are some problems for which we know only randomized algorithms and no deterministic ones. Furthermore, depending on the model of computation, randomization is often essential -- it provably does better than all deterministic algorithms. In this course, we will introduce the basic techniques of designing randomized algorithms, although at times we will dive into state-of-the-art topics. Students are expected to have taken an introductory course in algorithm design and analysis; some familiarity with probability is desirable, though not essential.

References:

  • "Randomized Algorithms" by Motwani and Raghavan


Elementary number theory, Finite fields, Arithmetic and algebraic algorithms, Secret key and public key cryptography, Pseudo random bit generators, Block and stream ciphers, Hash functions and message digests, Public key encryption, Probabilistic encryption, Authentication, Digital signatures, Zero knowledge interactive protocols, Elliptic curve cryptosystems, Formal verification, Cryptanalysis, Hard problems.

References:

  • D. Stinson, Cryptography: Theory and Practice.
  • A. Menezes et al., Handbook of Applied Cryptography.


Information retrieval using the Boolean model. The dictionary and postings lists. Tolerant retrieval. Index construction and compression. Vector space model and term weighting. Evaluation in information retrieval. Relevance feedback and query expansion. Probabilistic information retrieval. Language models for information retrieval. Text classification and clustering. Latent semantic indexing. Web search basics. Web crawling and indexes. Link analysis.

References:

  • C. D. Manning, P. Raghavan, and H. Schutze, Introduction to Information Retrieval, Cambridge University Press, 2008.
  • Recent Literature


Concepts of Agency and Intelligent Agents. Action of Agents, Percepts to Actions. Structure of Intelligent Agents, Agent Environments, Communicating, Perceiving, and Acting. Concepts of Distributed AI, Cooperation, and Negotiation. Applications: Web-based Agents, Database Applications. Agent Programming

References:

  • S. Russell and P. Norvig, Artificial Intelligence - A Modern Approach, Prentice Hall, 1995.
  • Recent Papers


Introduction to Artificial Intelligence, Problem solving, knowledge and reasoning, Logic, Inference, Knowledge based systems, reasoning with uncertain information, Planning and making decisions, Learning, Distributed AI, Communication, Web based agents, Negotiating agents, Artificial Intelligence Applications and Programming.

References:

  • S. Russell and P. Norvig, Artificial Intelligence - A Modern Approach, Prentice Hall, 1995.
  • George F. Luger, Artificial Intelligence, Pearson Education, 2001.
  • Nils J. Nilsson, Artificial Intelligence - A New Synthesis, Morgan Kaufmann Publishers, 2000


Models of concurrency: multi-threading, synchronization, event-based dispatch. Model checking: model checking abstractions, context bounding, partial order reduction. Static analysis: type systems for proving deadlock and race freedom, rely-guarantee framework for compositional reasoning. Security vulnerabilities/attacks: attacks targeting spatial and temporal memory safety violations, injection and scripting attacks. Vulnerability detection: overflow, heap, and string analyses; information flow.

References:

  • M. Ben-Ari, "Principles of concurrent and distributed programming", Addison-Wesley, 2006
  • "Handbook of model checking", Springer, 2014
  • Brian Chess and Jacob West, "Secure programming with static analysis", Addison Wesley, 2007
  • Additional research papers.


Introduction to Probability theory, Random variables, commonly used continuous and discrete distributions. Introduction to Stochastic Processes, Poisson process, Markov chains, steady state and transient analysis. Pseudo-random numbers: methods of generation and testing. Methods for generating continuous and discrete distributions. Methods for generating a Poisson process. Building blocks of Simulation, Data Structures and Algorithms. Introduction to Probabilistic modelling, Maximum Likelihood estimation. Variance reduction techniques: antithetic variates, control variates, common random numbers, importance sampling. Analysis of Simulation results: confidence intervals, design of experiments. Markov Chain Monte Carlo techniques.
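
A minimal sketch of two of the generation methods above: exponential variates via the inverse transform, and homogeneous Poisson-process arrival times built from them. The rate and horizon are arbitrary illustrative choices, not course parameters.

    import math
    import random

    def exponential(rate):
        """Inverse-transform method: if U ~ Uniform(0,1), -ln(1-U)/rate ~ Exp(rate)."""
        return -math.log(1.0 - random.random()) / rate

    def poisson_process(rate, horizon):
        """Arrival times of a homogeneous Poisson process on [0, horizon]:
        successive inter-arrival times are i.i.d. Exp(rate)."""
        t, arrivals = 0.0, []
        while True:
            t += exponential(rate)
            if t > horizon:
                return arrivals
            arrivals.append(t)

    random.seed(0)
    arrivals = poisson_process(rate=2.0, horizon=10.0)
    print(len(arrivals))   # roughly rate * horizon = 20 on average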

References:

  • Sheldon M. Ross: Introduction to Probability Models, 7th Edition, Academic Press, 2002
  • Donald E. Knuth: The Art of Computer Programming - Volume 2: Seminumerical Algorithms, 2nd Edition, Addison Wesley, Reading MA, USA, 2000
  • Sheldon M. Ross: Simulation, 3rd Edition, Academic Press, 2002
  • A. M. Law and W. D. Kelton: Simulation Modeling and Analysis, 3rd Edition, McGraw-Hill, New York, USA, 1998
  • Raj Jain: The Art of Computer Systems Performance Analysis, John Wiley and Sons, New York, USA, 1991


Introduction to computer networks; telephone networks, networking principles; switching - circuit switching, packet switching; scheduling - performance bounds, best effort disciplines, naming and addressing, protocol stack, SONET/SDH; ATM networks - AAL, virtual circuits, SSCOP; Internet - addressing, routing, end point control; Internet protocols - IP, TCP, UDP, ICMP, HTTP; performance analysis of networks - discrete and continuous time Markov chains, birth-death processes, time reversibility, queueing / delay models - M/M/1, M/M/m, M/M/m/m, M/G/1 queues, infinite server systems; open and closed queueing networks, Jackson's theorem, Little's law; traffic management - models, classes, scheduling; routing algorithms - Bellman Ford and Dijkstra's algorithms; multiple access, frequency and time division multiplexing; local area networks - Ethernet, token ring, FDDI, CSMA/CD, Aloha; control of networks - QoS, window and rate congestion control, open and closed loop flow control, large deviations of a queue and network, control of ATM networks.
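
For the queueing part of the syllabus, a small worked example of the standard M/M/1 formulas and Little's law; the arrival and service rates are arbitrary illustrative numbers.

    # M/M/1 steady-state metrics (assumes lambda < mu so the queue is stable).
    def mm1_metrics(lam, mu):
        rho = lam / mu                   # utilisation
        L = rho / (1 - rho)              # mean number in system
        W = 1 / (mu - lam)               # mean time in system
        Lq = rho**2 / (1 - rho)          # mean number waiting in queue
        Wq = rho / (mu - lam)            # mean waiting time in queue
        return rho, L, W, Lq, Wq

    lam, mu = 8.0, 10.0
    rho, L, W, Lq, Wq = mm1_metrics(lam, mu)
    print(rho, L, W)                     # 0.8, 4.0, 0.5
    assert abs(L - lam * W) < 1e-9       # Little's law: L = lambda * W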

References:

  • I. Mitrani, Modelling of Computer and Communication Systems, Cambridge, 1987.
  • J.Walrand and P.Varaiya, High Performance Communication Networks, Harcourt Asia (Morgan Kaufmann), 2000.
  • S.Keshav, An Engineering Approach to Computer Networking, Pearson Education, 1997.
  • D.Bertsekas and R.Gallager, Data Networks, Prentice Hall of India, 1999.
  • J.F.Kurose and K.W.Ross, Computer Networking: A Top-Down Approach Featuring the Internet, Pearson Education, 2001.


Review of Probability theory: Random variables, Expectation, Central Limit theorem. Latent variable models: mixture models, Hidden Markov models, EM algorithm. Graphical models: Algorithms for Inference, Markov Chain Monte Carlo Methods, Belief Propagation, Variational methods, Factor Analysis. Applications to Text: Maxent Formalism, Statistical Parsing, CKY algorithm, Topic models.

References:

  • Sheldon Ross - Introduction to Probability theory
  • C. Bishop - Pattern Recognition and Machine Learning
  • J. Pearl: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference
  • C. Manning and H. Schutze - Foundations of Statistical Natural Language Processing


Processor architecture, pipelining, vector processing, superscalar processors, hardware and compiler support for branch prediction, out-of-order Instruction issue, speculative execution and other techniques for high-performance, Instruction and data cache organizations, multilevel caches, parallel memory systems, Support for virtual memory, Multiple processor systems, taxonomy, programming models, message passing systems, Interconnection networks, shared memory system, memory models, cache coherence, I/O systems, parallel disk organisations, Introduction to advanced topics.

References:

  • Hennessy, J.L., and Patterson, D.A.: Computer Architecture: A Quantitative Approach, Morgan Kaufmann.
  • Stone, H.S.: High-Performance Computer Architecture, Addison-Wesley.
  • Current literature


Voronoi diagram, Delaunay triangulation. Geometric data structures: interval tree, range tree, segment tree. Complexes: simplicial complex, Rips complex, alpha complex; homology, Betti numbers, persistent homology, Morse functions, Reeb graph. Approximation and fixed-parameter algorithms for geometric problems: hitting set and set cover, epsilon nets, epsilon approximations, geometric intersection graphs, geometric discrepancy, clustering.

References:

  • Computational Topology : An Introduction, Herbert Edelsbrunner and John L. Harer, American Mathematical Society, Indian Edition, 2010.
  • Computational Geometry: Algorithms and Applications, Mark de Berg, Otfried Cheong, Marc van Kreveld, and Mark Overmars, Third Edition, Springer (SIE), 2011.
  • Geometric Approximation Algorithms, Sariel Har-Peled, American Mathematical Society, Indian Edition, 2013.

Prerequisites

  • E0225 : Design and Analysis of Algorithms


This course is a complexity-theoretic introduction to Cryptography. Emphasis will be placed on exploring connections between various fundamental cryptographic primitives via reductions.

Some of the primitives we will cover are one-way functions, pseudo-random generators, pseudo-random functions, trapdoor permutations, encryption, digital signatures, hash functions, commitments. We will also try to cover some special topics (private information retrieval, zero-knowledge proofs, oblivious transfer etc.).


Greedy algorithms; Local search; Linear programs (relaxations and rounding); Iterated rounding; Primal dual algorithms; Randomized rounding; Semidefinite programming; Sparsest cut and metric embeddings; Label cover; Unique games.

References:

  • "The Design of Approximation Algorithms" by David Shmoys and David Williamson".
  • Approximation Algorithms and Semidefinite Programming" by Bernd GÃrtner and Jiri Matousek - Research papers.

Prerequisites

  • E0225: Design and Analysis of Algorithms.


Abstract data types and data structures, Classes and objects, Complexity of algorithms: worst case, average case, and amortized complexity. Algorithm analysis. Algorithm Design Paradigms. Lists: stacks, queues, implementation, garbage collection. Dictionaries: Hash tables, Binary search trees, AVL trees, Red-Black trees, Splay trees, Skip-lists, B-Trees. Priority queues. Graphs: Shortest path algorithms, minimal spanning tree algorithms, depth-first and breadth-first search. Sorting: Advanced sorting methods and their analysis, lower bound on complexity, order statistics.

References:

  • A.V. Aho, J.E. Hopcroft, and J.D. Ullman, Data Structures and Algorithms, Addison Wesley, Reading Massachusetts, USA, 1983
  • T.H. Cormen, C.E. Leiserson, and R.L. Rivest, Introduction to Algorithms, The MIT Press, Cambridge, Massachusetts, USA, 1990
  • M.A. Weiss, Data Structures and Algorithms Analysis in C++, Benjamin/Cummins, Redwood City, California, USA, 1994.


Features and implementation of imperative, object-oriented, concurrent, distributed, logic-programming, functional, aspect-oriented, scripting, business-oriented and web programming languages.

Example languages from each of these categories will be discussed along with their implementation details. Formal semantics will be used to enhance the understanding of the features and to assist in the design of correct implementations; however, there will be no deep discussion of the theory. This is neither a course on compiler design nor a course on the theory of programming languages. The emphasis is on understanding the features and their implementation. Students will be required to carry out mini projects as part of the course.

References:

  • Robert Harper, Practical Foundations for Programming Languages, Cambridge University Press, 2012.
  • John Mitchell, Concepts in Programming Languages, Cambridge University Press, 2002.
  • John Reynolds, Theories of Programming Languages, Cambridge University Press, 2009.
  • Selected papers

Prerequisites

  • None. However, programming in C/C++/Java/shell/Perl and a course on compiler design at the BE/BTech level would be helpful. There will be no overlap with the compiler design course in the CSA department (E0 255).


User Level Specification of OS. Fundamental Concepts of Multiprogrammed OS, Basic Concepts and Techniques for Implementation of Multiprogrammed OS. Processes and the Kernel, Microkernel Architecture of OS. Multiprocessor, Multimedia, and Real-Time OS. POSIX Standards. Management and Control of Processes. Basic Concept of Threads, Types of Threads, Models of Thread Implementations. Traditional and Real-Time Signals. Clocks, Timers and Callouts. Thread Scheduling for Unix, Windows, and Real-Time OS, Real-Time Scheduling. Interprocess/Interthread Synchronization and Communication, Mutual Exclusion/Critical Section Problem, Semaphores, Monitors, Mailbox, Deadlocks. Concepts and Implementation of Virtual Memory (32-bit and 64-bit), Physical Memory Management. File Organization, File System Interface and Virtual File Systems, Implementation of File Systems. I/O Software: Interrupt Service Routines and Device Drivers. Protection and Security. Case Study of Unix, Windows, and Real-Time OS.
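
To illustrate the semaphore-based synchronisation topics, a minimal bounded-buffer (producer/consumer) sketch using Python threads; buffer capacity and item counts are arbitrary choices made for this example, not course material.

    import threading
    from collections import deque

    buffer, capacity = deque(), 3
    empty = threading.Semaphore(capacity)   # counts free slots
    full = threading.Semaphore(0)           # counts filled slots
    mutex = threading.Lock()                # protects the shared deque

    def producer(n):
        for i in range(n):
            empty.acquire()                  # wait for a free slot
            with mutex:
                buffer.append(i)
            full.release()                   # signal one more item

    def consumer(n, out):
        for _ in range(n):
            full.acquire()                   # wait for an item
            with mutex:
                out.append(buffer.popleft())
            empty.release()                  # signal one more free slot

    items, consumed = 10, []
    t1 = threading.Thread(target=producer, args=(items,))
    t2 = threading.Thread(target=consumer, args=(items, consumed))
    t1.start(); t2.start(); t1.join(); t2.join()
    print(consumed)                          # [0, 1, ..., 9], in order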

References:

  • Andrew S. Tanenbaum: Modern Operating Systems, Second Edition, Pearson Education, Inc., 2001.
  • Uresh Vahalia: UNIX Internals: The New Frontiers, Prentice-Hall, 1996.
  • J. Mauro and R. McDougall: Solaris Internals: Core Kernel Architecture, Sun Microsystems Press, 2001.
  • Daniel P. Bovet and Marco Cesati: Understanding the Linux kernel, 2nd Edition O'Reilly & Associates, Inc., 2003.


Security Goals and Violations; Security Requirements; Security Services; Discrete Logs, Encryption/Decryption Functions, Hash Functions, MAC Functions; Requirements and Algorithmic Implementation of One-Way Functions; OS Security Violations and Techniques to Prevent Them; Access Control Models; Secure Programming Techniques; Authenticated Diffie-Hellman Key Establishment Protocols; Group Key Establishment Protocols; Block Ciphers and Stream Ciphers; Modes of Encryption; Digital Signatures; Authentication Protocols; Nonce and Timestamps; PKI and X.509 Authentication Service; BAN logic; Kerberos; E-mail Security; IP Security; Secure Socket Layer and Transport Layer Security; Secure Electronic Transactions; Intrusion Detection; Malicious Software Detection; Firewalls.

References:

  • William Stallings: Cryptography and Network Security: Principles and Practices, Fourth Edition, Prentice Hall, 2006.
  • Neil Daswani, Christoph Kern and Anita Kesavan: Foundations of Security: What Every Programmer Needs to Know, Published by Apress, 2007.
  • Yang Xiao and Yi Pan: Security in Distributed and Networking Systems, World Scientific, 2007.
  • Current Literature.

Prerequisites

  • Knowledge of Java is desirable, but not necessary.


Control flow graphs and analysis; Dataflow analysis; Static single assignment (SSA); Compiler optimizations; Dependence analysis, Loop optimizations and transformations, Parallelization, Optimizations for cache locality, and Vectorization; Domain-specific languages, compilation, and optimization; Register allocation, Instruction scheduling; Run time environment and storage management; Impact of language design and architecture evolution on compilers.

References:

  • Aho, A.V., Ravi Sethi and J.D. Ullman: Compilers - Principles, Techniques and Tools, Addison Wesley, 1988.
  • S. Muchnick: Advanced Compiler Design and Implementation, Morgan Kauffman, 1998
  • Selected Papers.


Software process and the role of modeling and analysis, software architecture, and software design. Software Modeling and Analysis: analysis modeling and best practices, traditional best practice diagrams such as DFDs and ERDs, UML diagrams and UML analysis modeling, analysis case studies, analysis tools, analysis patterns. Software Architecture: architectural styles, architectural patterns, analysis of architectures, formal descriptions of software architectures, architectural description languages and tools, scalability and interoperability issues, web application architectures, case studies. Software Design: design best practices, design patterns, extreme programming, design case studies, component technology, object oriented frameworks, distributed objects, object request brokers, case studies.

References:

  • Booch, G., Rumbaugh, J., and Jacobson, I., The Unified Modeling Language User Guide, Addison-Wesley, 1999.
  • Gamma, E., Helm, R., Johnson, R., and Vlissides, J., Design Patterns: Elements of Reusable Object-Oriented Software, Addison-Wesley, 1995.
  • Frank Buschmann et al. Pattern Oriented Software Architecture, Volume 1: A System of Patterns. John Wiley and Sons, 1996.
  • Shaw, M., and Garlan, D., Software Architecture: Perspectives on an Emerging Discipline, Prentice-Hall, 1996.
  • Len Bass et al. Software Architecture in Practice. Addison Wesley, 1998.


Survey of programming paradigms and computational models for program execution. Programming language examples, syntax description and language semantics. Functional programming, lambda calculus, higher-order functions, currying, recursion. Imperative programming and control structures, invariants, object models, messages, and method dispatch, inheritance, subtypes and subclasses, polymorphism, covariance, and contravariance. Formal aspects of Java. Concurrent programming models and constructs, programming in the multi-core environment. Introduction to Logic programming: quantifiers, first order logic, Horn clauses, unification and resolution.
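
A tiny illustration of two of the functional-programming notions listed above (higher-order functions and currying), written in Python purely for concreteness; the function names are invented for this sketch.

    # Higher-order function: takes functions and returns a new function.
    def compose(f, g):
        return lambda x: f(g(x))

    # Currying: a two-argument addition rewritten as a chain of one-argument functions.
    def curried_add(x):
        return lambda y: x + y

    inc = curried_add(1)                           # partial application
    double = lambda x: 2 * x
    print(compose(inc, double)(5))                 # 2*5 + 1 = 11
    print(list(map(curried_add(10), [1, 2, 3])))   # [11, 12, 13]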

References:

  • Daniel Friedman, Mitchell Wand and Christopher Haynes, "Essentials of Programming Languages", Prentice Hall of India, 2nd Edition, 2001.
  • John Reynolds, "Theories of Programming Languages", Cambridge Univ. Press, 1998.
  • John Mitchell, "Concepts in Programming Languages", Cambridge Univ. Press, 2003.
  • Benjamin Pierce, "Types and Programming Languages", MIT Press, 2002.
  • Selected Chapters from J. van Leeuwen, Ed., "Handbook of Theoretical Computer Science", Vol. B, Elsevier/MIT Press, 1994.
  • Kim Bruce, "Foundations of Object Oriented Languages", Prentice Hall of India, 2002.
  • Martin Abadi and Luca Cardelli, "A Theory of Objects", Springer, 1996.
  • Current research papers and Internet resources


Data Analytics is assuming increasing importance in recent times. Several industries are now built around the use of data for decision making. Several research areas too, genomics and neuroscience being notable examples, are increasingly focused on large-scale data generation rather than small-scale experimentation to generate initial hypotheses. This brings about a need for data analytics. This course will develop modern statistical tools and modelling techniques through hands-on data analysis in a variety of application domains. The course will illustrate the principles of hands-on data analytics through several case studies (8-10 such studies). On each topic, we will introduce a scientific question and discuss why it should be addressed. Next, we will present the available data, how it was collected, etc. We will then discuss models, provide analyses, and finally touch upon how to address the scientific question using the analyses. Data sets from astronomy, genomics, visual neuroscience, sports, speech recognition, computational linguistics and social networks will be analysed in this course. Statistical tools and modelling techniques will be introduced as needed to analyse the data and eventually address these scientific questions. There will also be a few guest lectures from industry.

Prerequisites

  • Random Processes (E2 202) or Probability and Statistics (E0 232) or equivalent.


Design of Database Kernels, Query Optimization (Rewriting Techniques, Access Methods, Join Algorithms, Plan Evaluation), Transaction Management (ARIES), Distributed Databases (Query Processing and Optimization, Concurrency Control, Commit Protocols), Object-Relational Databases (Motivation, Design and Implementation), Spatial Databases (Storage, Indexing Techniques, Query Optimization), Data Mining (Association, Classification and Sequence Rules, Integration with Database Engines), Data Warehousing (Star and Snowflake Schemas, Data Cubes, View Maintenance), Semistructured and Web Databases (Data Models, Query Systems, XML, XML-Schema, Relational Storage, Compression), Mobile Databases (Broadcast Disks, Indexing Techniques), Applications to E-commerce.

References:

  • Fundamentals of Database Systems, R. Elmasri and S. B. Navathe, Addison-Wesley, 3rd ed., 1999.
  • Database Management Systems, R. Ramakrishnan and J. Gehrke, McGraw-Hill, 2nd ed., 1999.
  • Readings in Database Systems, M. Stonebraker and J. Hellerstein, Morgan Kaufmann, 3rd ed., 1998.
  • Object-Relational DBMSs, M. Stonebraker, Morgan Kaufmann, 1996.
  • Data Warehousing (Strategies, Technologies and Techniques), R. Mattison, IEEE Press, 1998.
  • Data Mining, R. Groth, Prentice Hall, 1998.
  • Recent Conference and Journal papers.

Prerequisites

  • Data Structures, C or C++, Undergraduate course in DBMS


Fundamental Issues in Distributed Systems, Distributed System Models and Architectures; Classification of Failures in Distributed Systems, Basic Techniques for Handling Faults in Distributed Systems; Logical Clocks and Virtual Time; Physical Clocks and Clock Synchronization Algorithms; Security Issues in Clock Synchronization; Secure RPC and Group Communication; Group Membership Protocols and Security Issues in Group Membership Problems; Naming Service and Security Issues in Naming Service; Distributed Mutual Exclusion and Coordination Algorithms; Leader Election; Global State, Termination and Distributed Deadlock Detection Algorithms; Distributed Scheduling and Load Balancing; Distributed File Systems and Distributed Shared Memory; Secure Distributed File Systems; Distributed Commit and Recovery Protocols; Security Issues in Commit Protocols; Checkpointing and Recovery Protocols; Secure Checkpointing; Fault-Tolerant Systems, Tolerating Crash and Omission Failures; Implications of Security Issues in Distributed Consensus and Agreement Protocols; Replicated Data Management; Self-Stabilizing Systems; Design Issues in Specialized Distributed Systems.

References:

  • Randy Chow, and Theodore Johnson, "Distributed Operating Systems and Algorithms", Addison-Wesley, 1997.
  • Sukumar Ghosh, "Distributed Systems: An Algorithmic Approach", CRC Press, 2006.
  • Kenneth P. Birman, "Reliable Distributed Systems: Technologies, Web Services, and Applications", Springer New York, 2005.
  • G. Coulouris, J. Dollimore, and T. Kindberg, "Distributed Systems: Concepts and Designs", Fourth Edition, Pearson Education Ltd., 2005.
  • Current Literature

Prerequisites

  • NDSS(E0 254) or equivalent course


Introduction, Data Preparation, Linear Methods for Classification and Regression, Additive Models and Tree based methods, Support Vector Machines, Model Assessment and Selection, Unsupervised Learning, Link Analysis, Recommendation Systems and Handling Large Datasets: MapReduce.

References:

  • James, Witten, Hastie and Tibshirani, An Introduction to Statistical Learning with Applications in R, Springer, 2015
  • Rajaraman, Leskovec and Ullman, Mining of Massive Datasets, Cambridge University Press, 2014
  • Hastie, Tibshirani and Friedman, The Elements of Statistical Learning, Springer, 2009
  • Recent literature

Prerequisites

  • Linear Algebra, Probability and Statistics, Some programming experience in any language.


Graph types: conditional independence; directed, undirected, and factor models; algorithms for conditional independence (e.g., Bayes-ball, d-separation, Markov properties on graphs, factorization, Hammersley-Clifford theorems). Static models: linear Gaussian models, mixture models, factor analysis, probabilistic decision trees, Markov Random Fields, Gibbs distributions, static conditional random fields (CRFs), multivariate Gaussians as graphical models, the exponential family, generalized linear models, factored exponential families. Dynamic (temporal) models: Hidden Markov Models, Kalman filtering and linear-Gaussian HMMs, linear dynamic models, dynamic Bayesian networks (DBNs), label and observation bias in natural language processing, dynamic conditional random fields (CRFs), and general dynamic graphical models. Chordal graph theory: moralization; triangulated, decomposable, and intersection graphs; tree-width and path-width parameters of a graph. Exact probabilistic inference: the elimination family of algorithms, relation to dynamic programming, generality (such as Viterbi, MPE, the fast Fourier transform), junction trees, belief propagation, optimal triangulations, NP-hardness results. Approximate probabilistic inference: loopy belief propagation (BP), expectation propagation (EP), sampling (Markov chains, Metropolis-Hastings, Gibbs, convergence and implementational issues), particle filtering. Structure learning: the Chow-Liu algorithm. Latent Dirichlet Allocation: exchangeability, de Finetti's theorem, inference using collapsed Gibbs sampling, the Dirichlet compound multinomial model.
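
As a concrete instance of the exact-inference/dynamic-programming theme above, a minimal forward algorithm computing the likelihood of an observation sequence under a tiny two-state HMM; all the numbers are invented for illustration and are not course material.

    # Forward algorithm for a toy 2-state HMM; every parameter below is made up.
    states = [0, 1]
    pi = [0.6, 0.4]                          # initial state distribution
    A = [[0.7, 0.3],                         # A[i][j] = P(next state j | current state i)
         [0.4, 0.6]]
    B = [[0.5, 0.5],                         # B[i][o] = P(observation o | state i)
         [0.1, 0.9]]

    def forward_likelihood(obs):
        alpha = [pi[s] * B[s][obs[0]] for s in states]
        for o in obs[1:]:
            alpha = [sum(alpha[i] * A[i][j] for i in states) * B[j][o] for j in states]
        return sum(alpha)                    # P(obs) marginalised over all state paths

    print(forward_likelihood([0, 1, 1]))     # likelihood of the observation sequence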

References:

  • "Probabilistic Graphical Models: Principles and Techniques", Daphne Koller and Nir Friedman
  • Relevant papers

Prerequisites

  • Introduction to Probability and Statistics and Consent of Instructor


Introduction to machine learning. Classification: nearest neighbour, decision trees, perceptron, support vector machines, VC-dimension. Regression: linear least squares regression, support vector regression. Additional learning problems: multiclass classification, ordinal regression, ranking. Ensemble methods: boosting. Probabilistic models: classification, regression, mixture models (unconditional and conditional), parameter estimation, EM algorithm. Beyond IID, directed graphical models: hidden Markov models, Bayesian networks. Beyond IID, undirected graphical models: Markov random fields, conditional random fields. Learning and inference in Bayesian networks and MRFs: parameter estimation, exact inference (variable elimination, belief propagation), approximate inference (loopy belief propagation, sampling). Additional topics: semi-supervised learning, active learning, structured prediction.

References:

  • Bishop, C.M., Pattern Recognition and Machine Learning. Springer, 2006.
  • Duda, R O, Hart P E and Stork D G. Pattern Classification. Wiley-Interscience, 2nd Edition, 2000.
  • Hastie T, Tibshirani R and Friedman J, The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2nd Edition, 2009.
  • Mitchell T, Machine Learning. McGraw Hill, 1997.
  • Current literature.

Prerequisites

  • Probability and Statistics (or equivalent course elsewhere). Some background in linear algebra and optimization will be helpful.


Principles of computer graphics; graphics pipeline; graphics hardware; transformations; viewing; lighting; shading; modeling; selected topics in meshing, subdivision techniques, multi-resolution methods, visualization, ray tracing; individual projects.

References:

  • Edward S. Angel. Interactive Computer Graphics, A top-down approach with OpenGL. Addison-Wesley, 2005.
  • OpenGL Architecture Review Board, Dave Shreiner, Mason Woo, Jackie Neider, and Tom Davis, OpenGL Programming Guide: The Official Guide to Learning OpenGL. Addison-Wesley, 2005.
  • Donald Hearn and M. Pauline Baker. Computer Graphics with OpenGL. Prentice Hall, 2003.

Prerequisites

  • Courses in linear algebra, data structures, algorithms, and programming.


Domain modeling using first-order predicate logic and relational calculus -- the tools Alloy and Event-B. Verification of finite-state systems and concurrent systems -- SAL and Spin. Code development using refactoring -- Eclipse refactorings. Identifying errors in code during development using dataflow analysis and logical reasoning -- FindBugs and Spec#. Testing and bounded exploration of applications -- Pex.

References:

  • Logic in Computer Science: Modelling and Reasoning about Systems, by Michael Huth and Mark Ryan.
  • Software Abstractions: Logic, Language, and Analysis, by Daniel Jackson.
  • Model Checking, by Edmund M. Clarke, Orna Grumberg, and Doron Peled.
  • Specifying software: A Hands-On Introduction, by R. D. Tennent.
  • Research papers.


High-dimensional data: modeling of data in a high dimensional space, high dimensional Euclidean geometry, the random projection theorem. Random graphs: the Erdos-Renyi model, properties of random graphs, giant component and threshold phenomena, random satisfiability problems. Singular Value Decomposition and its applications. Random walks: connections to electrical networks, convergence using eigenvalues and conductance measures. Foundations of learning theory: the perceptron algorithm, margins and support vector machines, the Vapnik-Chervonenkis theorem and applications. Clustering algorithms and criteria: provable results and algorithms for k-means and other criteria; recent work on finding local communities using random walks. Massive data computations, including streaming algorithms. Fast approximations to matrices such as the CUR approximation. Games and economics-related models and algorithms: the notion of equilibrium, its existence and computation, markets and market equilibria.
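
A minimal sketch of Lloyd's iteration for the k-means objective mentioned above, on synthetic data; this is an illustrative baseline only and makes no claim about the provable variants discussed in the course.

    import numpy as np

    def lloyd_kmeans(X, k, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]   # random initial centers
        for _ in range(iters):
            # assignment step: nearest center for each point
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # update step: mean of each cluster (keep old center if a cluster empties)
            new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                    else centers[j] for j in range(k)])
            if np.allclose(new_centers, centers):
                break
            centers = new_centers
        return centers, labels

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    centers, labels = lloyd_kmeans(X, k=2)
    print(np.round(centers, 2))              # roughly [0, 0] and [3, 3]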

References:

  • John Hopcroft and Ravi Kannan. Mathematics for Modern Computing. Forthcoming chapters will be made available.
  • Ravi Kannan and Santosh Vempala. Spectral Algorithms, Now Publishers, 2009.


1. Introduction, Motivation: application domains of Geographical Information Systems (GIS), common GIS data types and analysis, OGC standards and reference geometry model.
2. Models of spatial data: conceptual data models for spatial databases (e.g. pictogram-enhanced ERDs); logical data models for spatial databases: raster model (map algebra), vector model (OGIS/SQL1999).
3. Spatial query languages: need for spatial operators and relations, SQL3 and ADTs, spatial operators, OGIS queries.
4. Spatial storage methods: clustering methods (space filling curves), storage methods (R-tree, Grid files), concurrency control (R-link trees), compression methods for raster and vector data, spatial indexing (a small Morton-key sketch follows this list).
5. Spatio-temporal and moving object databases: spatio-bitemporal objects and operations, querying, event models, spatio-temporal indexes.
6. Processing spatial queries: spatial selection, joins, aggregates, nested queries, buffers.
7. Query processing and optimization: strategies for range query, nearest neighbor query, spatial joins (e.g. tree matching), cost models for new strategies, impact on rule-based optimization, selectivity estimation.
8. Spatial networks: road network databases and connectivity graphs, topology storage, queries for spatial networks.
9. Mining spatial databases: clustering, spatial classification, co-location patterns, spatial outliers.
10. Geosensor databases.
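
To illustrate the space-filling-curve clustering idea from item 4, a minimal Z-order (Morton) key for 2-D cell coordinates; the 16-bit coordinate width and the sample cells are arbitrary assumptions made for this sketch.

    # Morton (Z-order) key: interleave the bits of a 2-D cell coordinate so that
    # cells that are close in space tend to be close in key (storage) order.

    def morton_key(x, y, bits=16):
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (2 * i)        # even bit positions come from x
            key |= ((y >> i) & 1) << (2 * i + 1)    # odd bit positions come from y
        return key

    cells = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 0), (2, 2)]
    print(sorted(cells, key=lambda c: morton_key(*c)))
    # [(0, 0), (1, 0), (0, 1), (1, 1), (2, 0), (2, 2)]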

References:

  • Spatial Databases: A Tour, S. Shekhar and S. Chawla, Prentice Hall, 2003
  • Moving Objects Databases, Ralf Hartmut Guting and Markus Schneider, Morgan Kaufmann, 2005
  • Spatial Databases with Applications to GIS, P. Rigaux, M. Scholl, A. Voisard, Morgan Kaufmann, 2002
  • Spatio-Temporal Databases, M. Koubarakis, T. Sellis et al. (eds.), Springer, 2003
  • Selected papers (see the bibliography available at: http://www.spatial.cs.umn.edu/Courses/Fall07/8715/paperList.html)


This course will cover topics on developing applications on mobile smartphone platforms. Primary emphasis will be on Android development, while students will also learn the basics of developing applications for iOS. The course will include a project that will be defined and executed by student groups.

References:

  • The Android and iOS developer documentation.
  • Lecture notes handed out in class.
  • Papers from recent conferences and journals.

Prerequisites

  • Programming skills.


Reinforcement learning is a paradigm that aims to model the trial-and-error learning process that is needed in many problem situations where explicit instructive signals are not available. It has roots in operations research, behavioral psychology and AI. The goal of the course is to introduce the basic mathematical foundations of reinforcement learning, as well as highlight some of the recent directions of research. The Reinforcement Learning problem: evaluative feedback, non-associative learning, Rewards and returns, Markov Decision Processes, Value functions, optimality and approximation. Bandit Problems: Explore-exploit dilemma, Binary Bandits, Learning automata, exploration schemes. Dynamic programming: value iteration, policy iteration, asynchronous DP, generalized policy iteration. Monte-Carlo methods: policy evaluation, roll outs, on policy and off policy learning, importance sampling. Temporal Difference learning: TD prediction, Optimality of TD(0), SARSA, Q-learning, R-learning, Games and after states. Eligibility traces: n-step TD prediction, TD(lambda), forward and backward views, Q(lambda), SARSA(lambda), replacing traces and accumulating traces. Function Approximation: Value prediction, gradient descent methods, linear function approximation, Control algorithms, Fitted Iterative Methods. Policy Gradient methods: non-associative learning - REINFORCE algorithm, exact gradient methods, estimating gradients, approximate policy gradient algorithms, actor-critic methods. Hierarchical RL: MAXQ framework, Options framework, HAM framework, Option discovery algorithms. Case studies: Elevator dispatching, Samuel's checker player, TD-gammon, Acrobot, Helicopter piloting, Computational Neuroscience.
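
A minimal tabular Q-learning sketch on an invented five-state chain MDP, illustrating the TD-style update named above; the reward structure, learning rate, discount and exploration schedule are all arbitrary choices for this example.

    import random

    # Toy chain MDP: states 0..4, actions 0 (left) / 1 (right); reaching state 4 ends
    # the episode with reward 1. All parameters below are illustrative assumptions.
    N_STATES, ACTIONS, GOAL = 5, (0, 1), 4

    def step(s, a):
        s2 = max(0, s - 1) if a == 0 else min(GOAL, s + 1)
        return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

    def choose(Q, s, eps):
        """Epsilon-greedy action selection with random tie-breaking."""
        if random.random() < eps:
            return random.choice(ACTIONS)
        best = max(Q[s])
        return random.choice([a for a in ACTIONS if Q[s][a] == best])

    def q_learning(episodes=500, alpha=0.1, gamma=0.9, eps=0.1):
        Q = [[0.0, 0.0] for _ in range(N_STATES)]
        for _ in range(episodes):
            s, done = 0, False
            while not done:
                a = choose(Q, s, eps)
                s2, r, done = step(s, a)
                # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s', a')
                Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
                s = s2
        return Q

    random.seed(0)
    Q = q_learning()
    print([max(ACTIONS, key=lambda a: Q[s][a]) for s in range(GOAL)])  # greedy policy: [1, 1, 1, 1]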

References:

  • R. S. Sutton and A. G. Barto. Reinforcement Learning - An Introduction. MIT Press. 1998.
  • D. P. Bertsekas and J. N. Tsitsiklis. Neuro-dynamic Programming. Athena Scientific. 1996.
  • K. S. Narendra and M. A. L. Thathachar. Learning Automata - An Introduction. Prentice-Hall, USA. 1989.
  • A. G. Barto and S. Mahadevan, Recent Advances in Hierarchical Reinforcement Learning, Discrete Event Systems Journal, Volume 13, Special Issue on Reinforcement Learning, pp. 41-77. 2003.
  • R. J. Williams, Simple Statistical Gradient-following algorithms for Connectionist Reinforcement Learning, Machine Learning, 8:229-256. 1992.
  • J. Baxter, P. L. Bartlett, Infinite-Horizon Gradient-Based Policy Search, Journal of Artificial Intelligence Research, 15: 319-350. 2001.
  • V. R. Konda and J. N. Tsitsiklis, "Actor-Critic Algorithms", SIAM Journal on Control and Optimization, Vol. 42, No. 4, pp. 1143-1166. 2003.

Prerequisites

  • Basic probability theory and linear algebra, Familiarity with regression and non-linear optimization.


Objectives:

To provide a graduate-level understanding of the concepts, hardware and software systems, and applications of virtual reality.

Course methodology:

This course will be driven by a collaborative, problem-solving based approach. Students will create virtual environments by learning to use hardware and software tools and, in the process of creating such environments, grasp the underlying theory and concepts.

Course Outline:

Exploration of VR toolkits; applications of virtual reality: gaming, scientific visualisation, education and healthcare. Use of recent consumer-grade VR equipment (Google Glass, Razer Hydra, nvidia SoundStorm, etc.); programming with open source VR toolkits; conceptual foundations of VR: 3D interactions, multi-sensory interaction, immersion, head and body position tracking, perceptual issues; modelling of virtual worlds; 3D graphics, stereo and real-time rendering; physics-based rendering. Current challenges in virtual reality.

References:

  • Conference and journal papers, online resources for open source software and VR hardware.

Prerequisites

  • Competence in at least one programming language. A course in computer graphics is a plus, but not required. This is a project-intensive course.


Course objective: Study and design of machine learning techniques to improve software engineering.

Motivation: Machine learning has become an effective technique for making sense of large datasets to glean actionable insights. Large software repositories such as open-source Git repositories, smartphone app stores and student submissions in MOOC courses contain a wealth of information. The goal of this course is to study and design state-of-the-art machine learning techniques to improve software engineering using the large amount of code available.

Syllabus: Machine learning models for program analysis, automated program repair, program synthesis, mining software repositories, representation and deep learning for software engineering, programming language processing.

References:

  • Recent research papers

Prerequisites

  • Background in programming
  • Data mining or machine learning course in CSA.


Motivation and objectives of the course:

Software systems have become complex and their correctness and performance are not clearly understood or known. We identify key reasons why this could happen and provide a resource proportional model as one way to understand the performance behaviour of complex software systems.

Syllabus:

Software Bloat, Lost Throughput and Wasted Joules: An Opportunity for Green Computing: Why hardware advancements aren't enough; why a plea for lean software isn't enough. A Sustainable Principle for Emerging Systems: Resource Proportional Software Design. The Problem of Software Bloat: Causes and Consequences, Forms of Software Runtime Bloat, Progress in Bloat Characterization, Measurement and Mitigation. How Bloat in Software Impacts System Power Performance. Analyzing the Interplay of Bloat, Energy Proportionality and System Bottlenecks. Design Principles to Reduce Propensity for Bloat. A systems perspective on the origin of bloat. Defining resource proportionality with respect to feature utilization to predict bloat propensity. Strategies for bloat mitigation. What Component and Tool Developers Can Do. Refactoring Existing Software for Improved Resource Proportionality. Implications for Non-functional Requirements. Resource Proportional Programming for Persistent Memory Applications / Memory Speed Fabrics. Applying Resource Proportional Design Principles to a Deeply Layered Stack. Data Centric Resource Proportional Systems Design. Adapting the Systems Software Stack to a Changing Paradigm of Uniform Logical Access to a Radically Non-Uniform System. Bridging the gap from what we know today to open challenges and research topics.

References:

  • Current research papers.

Prerequisites

  • 2 or more 200-level courses in software systems (OS, databases, compilers, graphics)


The theme of this course in the Jan-Apr 2015 semester is arithmetic circuit complexity. Arithmetic circuits are the algebraic analogue of Boolean circuits that naturally compute multivariate polynomials. The quest for a thorough understanding of the power and limitations of the model of arithmetic circuits (and its connection to Boolean circuits) has led researchers to several intriguing structural, lower bound and algorithmic results. These results have bolstered our knowledge by providing crucial insights into the nature of arithmetic circuits. Still, many of the fundamental questions/problems on arithmetic circuits remain open to date. The aim of this course is to provide an introduction to this area of computational complexity theory. We plan to discuss several classical and contemporary results and learn about a wealth of mathematical (particularly, algebraic and combinatorial) techniques that form the heart of this subject.

References:

  • Current literature on Arithmetic circuit complexity.

Prerequisites

  • Familiarity with basic abstract algebra, linear algebra, probability theory and algorithms will be helpful. More importantly, we expect some mathematical maturity with an inclination towards theoretical computer science.


The course is composed of two parts. The first part will introduce the fundamentals of writing concurrent programs, their applicability in the context of building large-scale software systems, different models of concurrency, and various bug patterns. The second part will study recent trends in designing program analysis techniques to detect bugs, with special emphasis on scalable approaches. A course project will help consolidate the concepts covered in the lectures.

References:

  • Java Concurrency in Practice by Brian Goetz, Tim Peierls, Joshua Bloch, Joseph Bowbeer, David Holmes, Doug Lea, Addison-Wesley, (2006).
  • Slides and research papers listed on the course webpage.

Prerequisites

  • Previous experience with building a system will be helpful but not essential.


Tools from combinatorics are used in several areas of computer science. This course aims to teach some advanced techniques and topics in combinatorics. In particular, we would like to cover the probabilistic method, which is not covered in the introductory course 'Graph Theory and Combinatorics'. The course will also cover, to some extent, the linear-algebraic methods used in combinatorics, and will discuss some topics from extremal combinatorics.

Linear Algebraic methods: Basic techniques, polynomial space method, higher incidence matrices, applications to combinatorial and geometric problems. Probabilistic Methods: Basic techniques, entropy based method, martingales, random graphs. Extremal Combinatorics: Sunflowers, intersecting families, chains and antichains, Ramsey theory.

References:

  • L. Babai and P. Frankl: Linear algebra methods in combinatorics with applications to Geometry and Computer Science, Unpublished manuscript.
  • N. Alon and J. Spencer: The Probabilistic Method, Wiley-Interscience publication.
  • Stasys Jukna: Extremal Combinatorics with applications in computer science, Springer.

Prerequisites

  • Basic familiarity with probability theory, linear algebra, and graph theory and combinatorics.


Indistinguishability, real-ideal world and simulation-based security notions; Secret Sharing, Verifiable Secret Sharing, Oblivious Transfer, Circuit Garbling and function encoding, Commitment Schemes, Zero-knowledge Proofs, Threshold Cryptography, Encryption, Broadcast and Byzantine Agreement, Coin-tossing protocols, Theoretical and practical protocols for secure computation in various models.

References:

  • Book: “Efficient Secure Two-Party Protocols: Techniques and Constructions” by Carmit Hazay and Yehuda Lindell.
  • Book Draft: “Secure Multiparty Computation and Secret Sharing - An Information Theoretic Approach” by Ronald Cramer, Ivan Damgård and Jesper Buus Nielsen.
  • Recent Research Papers.

Prerequisites

  • Mathematical maturity.
  • Basic level crypto course.


Minors: Introduction - properties which cause dense minors in graphs (average degree, girth), Wagner's characterisation of graphs without K5 minors. Tree Decompositions: treewidth, pathwidth, upper and lower bounds for treewidth, relation of treewidth and minors, influence on algorithmic graph problems. Hadwiger's conjecture - its relation to the four colour theorem, related work.

References:

  • Graph Theory (Chapters 8 and 12), Reinhard Diestel, Springer, 2000.
  • Current Literature


The course will consist of two parts: computational aspects of algebra & number theory, and the use of algebraic methods in theoretical computer science. Part 1: Chinese remaindering, Discrete Fourier Transform, resultants of polynomials, Hensel lifting, automorphisms of rings, short vectors in lattices, smooth numbers, etc., and how these tools are used to design algorithms for certain fundamental problems like integer & polynomial factoring, integer & matrix multiplication, fast linear algebra, root finding, primality testing, discrete logarithm, etc. Part 2: This will deal with certain applications of algebraic methods/algorithms in cryptography (RSA cryptosystem, Diffie-Hellman), coding theory (Reed-Solomon & Reed-Muller codes, locally decodable codes), analysis of Boolean functions (Fourier analysis), and construction of expander graphs.
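
As a minimal sketch of the Chinese remaindering tool from Part 1 (the function name and incremental formulation are illustrative, not from the course material; requires Python 3.8+ for modular inverses via pow):

    from math import gcd

    def crt(residues, moduli):
        """Return x with x = r_i (mod m_i) for pairwise coprime moduli m_i."""
        x, m = 0, 1
        for r, mi in zip(residues, moduli):
            assert gcd(m, mi) == 1
            # Solve x + m*t = r (mod mi) using the inverse of m modulo mi.
            t = ((r - x) * pow(m, -1, mi)) % mi
            x += m * t
            m *= mi
        return x % m

    assert crt([2, 3, 2], [3, 5, 7]) == 23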

References:

  • Modern Computer Algebra by von zur Gathen and Gerhard.
  • Introduction to Finite Fields by Lidl & Niederreiter.
  • Relevant research papers and online lecture notes.

Prerequisites

  • Basic familiarity with linear algebra and properties of finite fields (as in Chapters 1-3 of the book 'Introduction to finite fields and their applications' by Rudolf Lidl and Harald Niederreiter). Alternatively, an undergraduate course in algebra. Most importantly, some mathematical maturity with an inclination towards theoretical computer science.


In this course, we aim to study algorithmic approaches for automating 1. synthesis of programs, 2. discovery of specifications of programs, and 3. selection of domain-specific algorithms. Along with presentations by the course instructors, every participant will be assigned a few papers to present in class. The exchange of knowledge will mainly be through open discussions in the classes. An optional course project will be offered for interested participants. The evaluation will be based on the quality of presentations, understanding of the material, and participation in class discussions.

References:

  • A number of classic as well as recent research papers have been identified carefully. The list can be made available if required. There are no specific textbook references for the course.

Prerequisites

  • Program Analysis and Verification (E0 227) or Automated Verification (E0 223); in other cases, you can seek permission from the instructors.


Basic Parameterized Algorithms: Depth-bounded search trees, iterative compression, color coding, treewidth and dynamic programming over graphs of bounded treewidth. Advanced themes: MSO logic, tree automata, Courcelle's theorem (a practical perspective). Influence of randomization on parameterized algorithms. Applications of parameterized algorithms in computational biology, information security, and learning.
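
As a minimal sketch of the depth-bounded search tree technique (the classic 2^k branching for k-Vertex Cover; the edge-list representation is an assumption made for brevity):

    def vertex_cover_at_most_k(edges, k):
        """Return True iff the graph (given as a list of edges) has a vertex cover of size <= k.
        Branch on any remaining edge (u, v): one of u, v must join the cover,
        so the search tree has depth <= k and at most 2^k leaves."""
        if not edges:
            return True
        if k == 0:
            return False
        u, v = edges[0]
        return any(
            vertex_cover_at_most_k([(a, b) for (a, b) in edges if w not in (a, b)], k - 1)
            for w in (u, v)
        )

    # A triangle needs two vertices in any cover.
    assert vertex_cover_at_most_k([(1, 2), (2, 3), (1, 3)], 2)
    assert not vertex_cover_at_most_k([(1, 2), (2, 3), (1, 3)], 1)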

References:

  • R. Niedermeier, Invitation to Fixed-Parameter Algorithms.
  • Downey and Fellows, Parameterized Complexity


Dataflow analysis: applications in program verification and transformation. Type systems: applications in software development and verification. Program slicing: Applications in software development. Techniques for points-to analysis. Symbolic execution: Applications in program testing. Model checking of software using abstractions. Program logics: applications in program verification. Techniques for testing and verification of concurrent programs.

References:

  • Research papers

Prerequisites

  • Program Analysis and Verification (E0 227)


Convex sets and functions, Convex Optimization Problems, Duality, Approximation and fitting, Statistical Estimation, Geometric Problems, Unconstrained minimization, Interior-point methods.

References:

  • S. Boyd and L. Vandenberghe: Convex Optimization, Cambridge University Press, 2004.


Convex Optimization - Introduction, Incremental Gradient, Subgradient and Proximal Methods. Nonsmooth Convex Optimization, DC (Difference of Convex functions) Programming, Lagrangian Relaxation – Dual Decomposition. Augmented Lagrangian Methods, Cutting Plane Methods, Large-Scale Learning - Approximate Optimization.
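
As a minimal sketch of a proximal method from the list above (ISTA for L1-regularized least squares; the step-size choice and problem sizes are illustrative assumptions):

    import numpy as np

    def soft_threshold(x, t):
        """Proximal operator of t * ||.||_1."""
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def ista(A, b, lam, n_iters=500):
        """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient descent."""
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the smooth part
        x = np.zeros(A.shape[1])
        for _ in range(n_iters):
            grad = A.T @ (A @ x - b)
            x = soft_threshold(x - step * grad, step * lam)
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
    x_hat = ista(A, A @ x_true, lam=0.1)   # recovers a sparse solution close to x_true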

References:

  • Optimization for Machine Learning, Suvrit Sra, Sebastian Nowozin and Stephen Wright (Editors), The MIT Press, Dec. 2011.
  • Recent Literature

Prerequisites

  • A course in Machine Learning or Data Mining


Probability spaces and measure theory, Borel sigma-algebras and random variables, Lebesgue theory of integration, expectation, Radon-Nikodym theorem, Shannon entropy and I-divergence, GYP-theorem for I-divergence, Pinsker inequality, stochastic processes and entropy rate, product spaces and Fubini's theorem, probability on metric spaces, conditional expectation, martingales, introduction to stochastic calculus.
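
For instance, the Pinsker inequality listed above bounds statistical distance by I-divergence: for probability measures P and Q,

  \[
    \tfrac{1}{2}\,\| P - Q \|_{1} \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)},
  \]

where $D(P\|Q)$ is measured in nats and $\|P-Q\|_1$ is the L1 (twice the total variation) distance.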

References:

  • Billingsley, P., Convergence of Probability Measures, Wiley-interscience, 1999.
  • Borkar, V. S., Probability Theory : An Advanced Course, Springer-Verlag, 1995.
  • K. R. Parthasarathy, Coding Theorems of Classical and Quantum Information theory TRIM publication, 2007.
  • I. Karatzas and S.E. Shreve, Brownian Motion and Stochastic Calculus, Springer; 2nd edition 1991.

Prerequisites

  • Any basic course in Probability.


Emerging encryption primitives like identity-based encryption, attribute-based encryption, predicate encryption, functional encryption etc. Cryptographic protocols for privacy preserving computation, secure storage and cloud. Revisiting the security definition and security reduction with an emphasis on concrete security and the interplay of functionality, security and efficiency of cryptographic protocols. Cryptanalysis of provable security.

References:

  • A selection of research papers from journals and conference proceedings.

Prerequisites

  • Cryptography (E0 235).


Entropy notions such as min-entropy, Shannon entropy, etc. Computational variants of these notions and the challenges in analyzing them. Randomness extractors, privacy amplification protocols, leakage-resilient cryptography. Design of error correcting codes with specialized properties (motivated by various cryptographic applications), e.g., non-malleable codes, updatable codes, etc.
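
For reference, the min-entropy of a random variable $X$ (the notion extractors are typically stated in terms of) is

  \[
    H_\infty(X) \;=\; -\log_2 \max_x \Pr[X = x],
  \]

so a source with $H_\infty(X) \ge k$ is one in which no single outcome has probability more than $2^{-k}$.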

References:

  • Research papers.

Prerequisites

  • An undergraduate course on Probability Theory will be helpful.


Architecture and hardware description languages (RTL, ISPS, VHDL). Processor architecture, Instruction level parallelism, Latency tolerance, multithreading, interconnection networks, Standards (bus, SCI), architectures, routing, Cache coherency, protocol specification, correctness, performance. Memory consistency models, synchronization primitives, parallel programming paradigms, I/O systems, Interface standards, parallel I/O, performance evaluation, analytical methods, simulation algorithms and techniques, benchmarking.

Prerequisites

  • Computer Architecture, Operating Systems, Some Familiarity with Analytical Performance Evaluation Techniques


Regression, feature selection, ensemble methods (boosting, bagging, etc.) and hidden Markov models. Selected topics in OS (related to the papers under discussion and including, as necessary, some review of required background).

References:

  • Current Literature (Conference proceedings of SOSP, SysML, etc)

Prerequisites

  • Background in at least one computer systems area like OS, Databases, Compilers etc. and Instructors' consent.


Selected topics in operating systems of topical interest. Design, implementation, correctness and performance related aspects. Past offerings included study of subsystems such as process, storage and network subsystems.

References:

  • Recent Literature

Prerequisites

  • Consent of instructor and a course in Operating Systems, Computer Architecture with some familiarity of the internals of Linux/Unix


Dynamic and Just-In-Time compilation. Compilation for embedded systems: performance, memory, and energy considerations. Static analysis: points-to analysis, abstract interpretation. WCET estimation. Type systems. Optimizations for O-O languages. Compilation for multi-core systems.
This course will be based on seminars and mini projects.

References:

  • Y.N. Srikant and Priti Shankar (ed.), The Compiler Design Handbook: Optimizations and Machine Code Generation, 2nd ed., CRC Press, 2008.

Prerequisites

  • Good knowledge of dataflow analysis and compiler optimizations


Parallel architectures: a brief history, design. Auto-parallelization for multicores, GPUs, and distributed-memory clusters. Lock-free and wait-free data structures/algorithms for parallel programming. Study of existing languages and models for parallel and high-performance programming; issues in the design of new ones.

References:

  • Aho, Lam, Sethi, and Ullman, Compilers: Principles, Techniques, and Tools, 2nd edition
  • Herlihy and Shavit, The Art of MultiProcessor Programming
  • Ananth Grama, Introduction to Parallel Computing
  • A list of research papers and other material, which will be the primary reference material, will be available on the course web page.

Prerequisites

  • Knowledge of "E0 255 Compiler Design" course content (especially on parallelization) will be very useful, but not absolutely necessary.
  • Knowledge of microprocessor architecture and some basic understanding of parallel programming models.


Object-oriented Databases, Distributed and Parallel Databases, Multi-databases, Access Methods, Transaction Management, Query Processing, Deductive Databases, Multimedia Databases, Real-Time Databases, Active Databases, Temporal Databases, Mobile Databases, Database Benchmarks, Database Security, Data Mining and Data Warehousing.

References:

  • Readings in Database Systems edited by M. Stonebraker, Morgan Kaufmann, 2nd ed., 1994.
  • Conference and Journal papers


Wireless Technologies: Land Mobile Vs. Satellite Vs. In-building Communications Systems, Cellular Telephony, Personal Communication Systems/Networks. Wireless Architectures for Mobile Computing. Applications. Wireless LANs, Wireless Networking, Hand-off, Media Access Methods, Mobile IP, Unicast and Multicast Communication, Wireless TCP, Security Issues. Mobile Computing Models, System-Level Support, Disconnected Operation, Mobility, Failure Recovery. Information Management, Broadcast, Caching, Querying Location Data. Location and Data Management for Mobile Computing, Hierarchical Schemes, Performance Evaluation. Case Studies.

References:

  • Current Literature from IEEE Transactions, Journals, and Conference Proceedings.
  • Abdelsalam A. Helal et al., Any Time, Anywhere Computing: Mobile Computing Concepts and Technology, Kluwer International Series in Engineering and Computer Science, 1999.
  • Evaggelia Pitoura and George Samaras, Data Management for Mobile Computing, Kluwer International Series on Advances in Database Management, October 1997.

Prerequisites

  • Consent of the Instructor


Theoretical foundations of modern machine learning. Kernel methods: support vector machines. Ensemble methods: boosting. Generalization analysis: VC-dimension bounds, covering numbers, margin analysis, Rademacher averages, algorithmic stability. Statistical consistency analysis. PAC learning. Online learning and regret bounds. Selected additional topics of current interest.

References:

  • Devroye, L, Gyorfi L, and Lugosi G, A Probabilistic Theory of Pattern Recognition. Springer, 1996.
  • Anthony M, and Bartlett P L, Neural Network Learning: Theoretical Foundations. Cambridge University Press, 1999.
  • Vapnik V N, Statistical Learning Theory. Wiley-Interscience, 1998.
  • Current literature.

Prerequisites

  • A strong foundation in probability and statistics, and some previous exposure to machine learning. Some background in linear algebra and optimization will be helpful.


Review of Directed Graphical Models: Semantics; Exact Inference using the Junction Tree Algorithm, Complexity Analysis; Parameter Estimation. Approximate Inference: Loopy Belief Propagation & Generalized Belief Propagation; Sampling techniques, Variational Techniques. Case Study: Latent Dirichlet Allocation. Nonparametric Bayesian Models: Dirichlet Processes, Chinese Restaurant Processes and Polya Urn; Hierarchical Dirichlet Processes and the Chinese Restaurant Franchise; Infinite Mixture Models and Indian Buffet Processes; Sequential models, Hidden Markov Dirichlet Processes and Hierarchical Dirichlet Process HMM; Hierarchical models, Nested Chinese Restaurant Processes; Dynamic models, Recurrent Chinese Restaurant Processes. Efficient Inference: Parallel, distributed and online algorithms.
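
As a minimal sketch of the Chinese Restaurant Process prior mentioned above (the concentration parameter alpha and the sampling routine are illustrative assumptions, not course material):

    import random

    def crp_table_assignments(n_customers, alpha=1.0, seed=0):
        """Sample seating from a Chinese Restaurant Process: customer i joins an existing
        table with probability proportional to its occupancy, or opens a new table with
        probability proportional to alpha."""
        rng = random.Random(seed)
        tables = []          # tables[t] = number of customers at table t
        assignments = []
        for i in range(n_customers):
            weights = tables + [alpha]
            r = rng.random() * (i + alpha)
            acc = 0.0
            for t, w in enumerate(weights):
                acc += w
                if r < acc:
                    break
            if t == len(tables):
                tables.append(1)   # new table
            else:
                tables[t] += 1
            assignments.append(t)
        return assignments

    print(crp_table_assignments(10))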

References:

  • Textbooks: Current literature.

Prerequisites

  • Machine Learning, Consent of the instructor.


Sequence Alignment, Global and Local Alignment, Hidden Markov Models and their Applications in sequence processing, Phylogenetics, Bayesian Statistics, Sampling Algorithms, Clustering, Classification of Gene expression datasets, Support vector machines, Optimization, Principal Component Analysis.
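
As a minimal sketch of global sequence alignment by dynamic programming (Needleman-Wunsch; the unit match/mismatch/gap scores are assumptions chosen for illustration):

    def global_alignment_score(a, b, match=1, mismatch=-1, gap=-2):
        """Needleman-Wunsch DP: best score of aligning strings a and b end to end."""
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]   # dp[i][j]: best score for a[:i] vs b[:j]
        for i in range(1, m + 1):
            dp[i][0] = i * gap
        for j in range(1, n + 1):
            dp[0][j] = j * gap
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                sub = match if a[i - 1] == b[j - 1] else mismatch
                dp[i][j] = max(dp[i - 1][j - 1] + sub,   # align a[i-1] with b[j-1]
                               dp[i - 1][j] + gap,       # gap in b
                               dp[i][j - 1] + gap)       # gap in a
        return dp[m][n]

    assert global_alignment_score("GATTACA", "GATTACA") == 7

Local alignment (Smith-Waterman) follows the same recurrence with scores clipped at zero.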

References:

  • R. Durbin, S. Eddy, A. Krogh, G. Mitchison, Biological Sequence Analysis, Cambridge University Press, 1998.
  • M. S. Waterman, Introduction to Computational Biology: Maps, Sequences and Genomes, Chapman and Hall - CRC Press, 1995.
  • Current Literature

Prerequisites

  • Consent of the Instructor.


Topological methods for analyzing scientific data; efficient combinatorial algorithms in Morse theory; topological data structures including contour trees, Reeb graphs, Morse-Smale complexes, Jacobi sets, and alpha shapes; robustness and application to sampled data from simulations and experiments; multi-scale representations for data analysis and feature extraction; application to data exploration and visualization.

References:

  • Textbooks: Course material will consist of current literature and lecture notes.

Prerequisites

  • Basic familiarity with fundamental algorithms and data structures is desirable (E0 225 or E0 251). Familiarity with the basics of scientific visualization will be useful but not essential. Interested students with a non-CS background may also register for the course after consent of instructor.


Fundamental Theorems: Radon's theorem, Helly's theorem. Geometric graphs: Proximity graphs, geometric results on planar graphs. Geometric incidences: Incidence bounds using the cuttings technique, crossing lemma. Distance based problems: Bounds on repeated distances and distinct distances. Epsilon Nets: Epsilon Net theorem using random sampling and discrepancy theory, epsilon nets for simple geometric spaces, weak epsilon nets.
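
For orientation, the crossing lemma referred to above states that every simple graph with $n$ vertices and $e \ge 4n$ edges satisfies

  \[
    \mathrm{cr}(G) \;\ge\; \frac{e^{3}}{64\, n^{2}},
  \]

a bound (stated here with the classical constant; sharper constants are known) that underlies many of the incidence results covered in the course.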

References:

  • Janos Pach and Pankaj K. Agarwal: Combinatorial Geometry, Wiley, 1st edition, 1995.
  • J. Matousek: Lectures on Discrete Geometry, Springer-Verlag, 1st edition, 2002.
  • Current literature

Prerequisites

  • The registrants should have preferably completed the "Design and Analysis of Algorithms" or "Discrete Structures" course.


Probability and basic information theory, universal data compression, I-projections and iterative algorithms for estimation with applications to statistics, large deviations and hypothesis testing, probabilities on metric spaces and information topology, Kolmogorov complexity, Applications of IT to other areas such as ergodic theory, gambling, biology.

References:

  • Information Theory and Statistics: A Tutorial by I. Csiszár and P. Shields, Now Publishers, 2008.
  • Elements of Information Theory, by T. M. Cover and J. A. Thomas, John Wiley and Sons, 2nd edition, 2006.
  • Information and Distribution: Occam's Razor in Action by P. Harremoes and A. Dukkipati, (in preparation) 2008.
  • Coding Theorems of Classical and Quantum Information theory by K. R. Parthasarathy, TRIM publication, 2007.
  • Information Theory, Inference, and Learning Algorithms by D.J.C. MacKay, Cambridge University Press, 2003.

Prerequisites

  • Basic probability theory or consent of instructor.


Preliminaries, polynomials, factorization of polynomials, Finite Fields, Berlekamp's algorithm, Hensel's lifting, LLL algorithm, applications to error correcting codes, the turnpike problem, some group theory, special cases of graph isomorphism, algorithms for primality testing.

References:

  • Joachim von zur Gathen and Jürgen Gerhard: Modern Computer Algebra
  • Relevant research papers and online notes.


Representing and processing data as high-dimensional points, Random graphs and other random models, Concentration of probability phenomena, Eigenvalues, Eigenvectors, Singular Value Decomposition and algorithmic applications, Massive matrix computations using randomized algorithms, Learning algorithms, Optimization.
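
As a minimal sketch of one algorithmic use of the SVD from the list above (best rank-k approximation; the matrix sizes are illustrative assumptions):

    import numpy as np

    def best_rank_k(A, k):
        """Truncated SVD: the rank-k matrix closest to A in Frobenius/spectral norm
        (Eckart-Young theorem)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 40))
    A5 = best_rank_k(A, 5)
    print(np.linalg.matrix_rank(A5), np.linalg.norm(A - A5))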

References:

  • Current Literature

Prerequisites

  • A solid undergrad background in Calculus, Linear Algebra, Probability and exposure to Algorithms.


Graph Theory: Connectivity, Matchings, Hamiltonian Cycles, Coloring Problems; Network flows, special classes of graphs. Introduction to Graph Minor theory. Combinatorics: Basic Combinatorial Numbers, Recurrence Relations, Inclusion-Exclusion Principle, Introduction to Polya Theory. Probabilistic Method in Graph theory: Basic Method, Expectation, Chernoff bound, Lovasz Local Lemma.

References:

  • J. H. Van Lint, R. M. Wilson, A Course in Combinatorics, Cambridge University Press, 1993.
  • N. Alon and J. H. Spencer, "The Probabilistic Method", John Wiley and Sons, 2nd edition, 2000.
  • R. Diestel, "Graph Theory", Springer-Verlag, 2nd edition, 2000.


Part I: Performance Analysis
Introduction to multi-tier application performance characteristics, Measurement-based performance analysis of distributed applications, Analytical performance modeling of multi-tier applications, Layered Queueing models (generic), Case studies of performance analysis of specific technologies (e.g. web servers, virtual machines).
Part II: Performance Management
Overload control mechanisms, QoS guarantee mechanisms, Dynamic resource provisioning mechanisms (e.g. in virtualized platforms), Power-aware performance management.
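
As a basic building block for the analytical models in Part I (the single-queue M/M/1 case, given here only for orientation and not as the course's modelling approach): with arrival rate $\lambda$ and service rate $\mu > \lambda$,

  \[
    U = \frac{\lambda}{\mu}, \qquad
    \bar{N} = \frac{U}{1-U}, \qquad
    \bar{R} = \frac{1}{\mu - \lambda},
  \]

i.e. utilization, mean number in system, and mean response time; layered queueing models compose such per-resource behaviour across the tiers of an application.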

References:

  • Scaling for e-business: technologies, models, performance, and capacity planning, Daniel A. Menascé, Virgilio A. F. Almeida, Prentice-Hall, 2000.
  • Papers:
  • Woodside, Neilson, Petriu, Majumdar, The Stochastic Rendezvous Network Model for Performance of Synchronous Client-Server-like Distributed Software, IEEE Trans. On Computers, January 1995 (vol. 44 no. 1) pp. 20-34.
  • Rolia and Sevcik, The Method of Layers, IEEE Transactions on Software Engineering, Volume 21 , Issue 8 (August 1995), Pages: 689 - 700.
  • John Dilley, Rich Friedrich, Tai Jin, Jerome Rolia, Web server performance measurement and modeling techniques, Performance Evaluation, Volume 33 , Issue 1 (June 1998), Special issue on tools for performance evaluation, Pages: 5 - 26
  • Paul Barford, Mark Crovella, Generating representative Web workloads for network and server performance evaluation, ACM SIGMETRICS Performance Evaluation Review, Volume 26, Issue 1 (June 1998), Pages: 151 - 160.
  • TF Abdelzaher, KG Shin, N Bhatti, Performance guarantees for web server endsystems: A control-theoretical approach, IEEE Transactions on Parallel and Distributed Systems, 2002.
  • Comparison of the three CPU schedulers in Xen, L Cherkasova, D Gupta, A Vahdat - Performance Evaluation Review, 2007.
  • B Urgaonkar, P Shenoy, A Chandra, P Goyal, Agile dynamic provisioning of multi-tier Internet applications, ACM Transactions on Autonomous and Adaptive Systems (TAAS), Volume 3 , Issue 1 (March 2008).
  • Jeffrey S. Chase, Darrell C. Anderson, Prachi N. Thakar, Amin M. Vahdat, Ronald P. Doyle, Managing energy and server resources in hosting centers, December 2001, SOSP '01: Proceedings of the eighteenth ACM symposium on Operating systems principles.

Prerequisites

  • It will be very useful to have a background in queuing systems (as provided in course E0 241, or any equivalent course from other departments). Undergraduate level background in Operating Systems and Computer Networking will be assumed. Students should be comfortable with a broad range of quantitative methods generally required in engineering.


Application performance characteristics; Performance metrics, their fundamental behaviour with respect to allocated resources, offered load, etc.; Overview of virtualization, Virtual Machines (e.g. Xen, KVM, VMware), Performance Isolation in virtual machines, Xen CPU Schedulers, schedulers in other VMs, Live migration; Understanding energy as a resource, Energy consumption behaviour of server machines, how power consumption can be controlled; Cloud Computing: overview, brief case studies, Dynamic and autonomic resource management in clouds, Resource allocation within one physical machine, Methods based on control theory, reinforcement learning, and other methods; Resource Management of a virtualized cluster – specifically approaches for power usage reduction, Methods based on control theory, reinforcement learning, and other methods.

References:

  • The Definitive Guide To The Xen Hypervisor (Series - Prentice Hall Open Source Software Development) by David Chisnall
  • Running Xen: A Hands-on Guide To The Art Of Virtualization by Jeanna Matthews, Eli M. Dow, Todd Deshane. Prentice Hall.

Prerequisites

  • Undergraduate level background in Operating Systems and Computer Networking will be assumed.


Introduction to pattern recognition, Bayesian decision theory, supervised learning from data, parametric and non parametric estimation of density functions, Bayes and nearest neighbor classifiers, introduction to statistical learning theory, empirical risk minimization, discriminant functions, learning linear discriminant functions, Perceptron, linear least squares regression, LMS algorithm, artificial neural networks for pattern classification and function learning, multilayer feed forward networks, backpropagation, RBF networks, support vector machines, kernel based methods, feature selection and dimensionality reduction methods.
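
As a minimal sketch of the Perceptron learning rule listed above (the learning rate, epoch count and toy data are illustrative assumptions):

    import numpy as np

    def perceptron(X, y, epochs=100, lr=1.0):
        """Learn w, b so that sign(w.x + b) matches labels y in {-1, +1}.
        The update fires only on misclassified points and converges when the
        data are linearly separable."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (w @ xi + b) <= 0:      # misclassified: move the boundary
                    w += lr * yi * xi
                    b += lr * yi
        return w, b

    X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
    y = np.array([1, 1, -1, -1])
    w, b = perceptron(X, y)
    assert np.all(np.sign(X @ w + b) == y)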

References:

  • R. O. Duda, P. E. Hart and D. G. Stork, Pattern Classification, John Wiley & Sons, 2002.
  • C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press (Indian Edition), 2003.

Prerequisites

  • Knowledge of Probability theory.


Syntax: syntactic processing; linguistics; parts-of-speech; grammar and parsing; ambiguity resolution; tree adjoining grammars. Semantics: semantic interpretation; word sense disambiguation; logical form; scoping noun phrases; anaphora resolution. Pragmatics: context and world knowledge; knowledge representation and reasoning; local discourse context and reference; discourse structure; semantic web; dialogue; natural language understanding and generation. Cognitive aspects: mental models, language acquisition, language and thought; theories of verbal field cognition. Applications: text summarization, machine translation, sentiment analysis, perception evaluation, cognitive assistive systems; NLP tool-kit augmentation.

References:

  • Allen J, Natural language understanding, Pearson Education, 1995, 2003.
  • Jurafsky D, and Martin J H, Speech and language processing: an introduction to natural language processing, computational linguistics and speech recognition, Pearson Education, 2000, 2003.
  • Posner M I, Foundations of Cognitive Science, MIT Press, 1998.
  • Research Literature.

Prerequisites

  • Familiarity with programming (optionally including scripting languages); data structures, algorithms and discrete structures; reasonable knowledge of English language.


Introduction: rationality, intelligence, common knowledge, von Neumann - Morgenstern utilities; Noncooperative Game Theory: strategic form games, dominant strategy equilibria, pure strategy Nash equilibrium, mixed strategy Nash equilibrium, existence of Nash equilibrium, computation of Nash equilibrium, matrix games, minimax theorem, extensive form games, subgame perfect equilibrium, games with incomplete information, Bayesian games. Mechanism Design: Social choice functions and properties, incentive compatibility, revelation theorem, Gibbard-Satterthwaite theorem, Arrow's impossibility theorem, Vickrey-Clarke-Groves mechanisms, dAGVA mechanisms, revenue equivalence theorem, optimal auctions. Cooperative Game Theory: Correlated equilibrium, two person bargaining problem, coalitional games, the core, the Shapley value, other solution concepts in cooperative game theory.
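
For example, the minimax theorem for matrix games covered above states that for any payoff matrix $A$,

  \[
    \max_{x \in \Delta_m} \; \min_{y \in \Delta_n} \; x^{\top} A\, y
    \;=\;
    \min_{y \in \Delta_n} \; \max_{x \in \Delta_m} \; x^{\top} A\, y,
  \]

where $\Delta_m$ and $\Delta_n$ are the mixed-strategy simplices of the two players; the common value is the value of the zero-sum game, and optimal mixed strategies can be computed by linear programming.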

References:

  • Roger B. Myerson, Game Theory: Analysis of Conflict, Harvard University Press, September 1997.
  • Martin J. Osborne, An Introduction to Game Theory, Oxford University Press, 2003.
  • Y. Narahari, Dinesh Garg, Ramasuri Narayanam, Hastagiri Prakash. Game Theoretic Problems in Network Economics and Mechanism Design Solutions. Springer, 2009.


Introduction to reinforcement learning, introduction to stochastic dynamic programming, finite and infinite horizon models, the dynamic programming algorithm, infinite horizon discounted cost and average cost problems, numerical solution methodologies, full state representations, function approximation techniques, approximate dynamic programming, partially observable Markov decision processes, Q-learning, temporal difference learning, actor-critic algorithms.
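
As a minimal sketch of the tabular Q-learning update listed above (the environment interface, learning rate and epsilon-greedy exploration scheme are assumptions, not prescribed by the course):

    import random
    from collections import defaultdict

    def q_learning(env, episodes=500, alpha=0.1, gamma=0.99, eps=0.1):
        """Tabular Q-learning: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).
        `env` is assumed to expose reset() -> s, step(a) -> (s', r, done),
        and a list env.actions of available actions."""
        Q = defaultdict(float)
        for _ in range(episodes):
            s, done = env.reset(), False
            while not done:
                if random.random() < eps:                       # epsilon-greedy exploration
                    a = random.choice(env.actions)
                else:
                    a = max(env.actions, key=lambda act: Q[(s, act)])
                s2, r, done = env.step(a)
                target = r + (0.0 if done else gamma * max(Q[(s2, a2)] for a2 in env.actions))
                Q[(s, a)] += alpha * (target - Q[(s, a)])
                s = s2
        return Q

Function approximation and actor-critic methods, also listed above, replace the table Q with a parameterized estimator.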

References:

  • D.P.Bertsekas and J.N.Tsitsiklis, Neuro-Dynamic Programming, Athena Scientific, 1996.
  • R.S.Sutton and A.G.Barto, Reinforcement Learning: An Introduction, MIT Press, 1998.
  • D.P.Bertsekas, Dynamic Programming and Optimal Control, Vol.I, Athena Scientific, 2005.


Foundations of pattern recognition. Soft computing paradigms for classification and clustering. Knowledge-based clustering. Association rules and frequent itemsets for pattern recognition. Large-scale pattern recognition.

References:

  • R. O. Duda, P. E. Hart, and D.G. Stork, Pattern Classification, John Wiley & Sons (Asia), Singapore, 2002
  • Recent Literature.


Biological versus computational dichotomy, critical computer - anatomy of the neocortex, the 100-steps-at-5-msec rule, symbolic architectures, the connectionist approach, multi-sensory-motor information, hierarchical, network and pyramidal models, spatio-temporal pattern matching, pattern representation and storage, invariant representations, sequences of sequences, autoassociative and content-addressable memory retrieval, the memory-prediction paradigm; domains: language acquisition, vision and attention, mental models; design and development of thought experiments and simulations.

References:

  • Posner M I, Foundations of Cognitive Science, The MIT Press, 1993.
  • Books and Survey Articles by: M. Minsky, A. Newell, H.A. Simon, D.E. Rumelhart, T. Sejnowski, J. Barwise, N. Chomsky, S. Pinker, V.S. Ramachandran and others


Foundational results in game theory and mechanism design: Nash's existence theorem, Arrow's impossibility theorem, the Gibbard-Satterthwaite theorem, etc.; Selected topics in repeated games, evolutionary games, dynamic games, and stochastic games; Selected topics at the interface between game theory, mechanism design, and machine learning; Selected topics in algorithmic game theory; Modern applications of game theory and mechanism design: incentive compatible learning, social network analysis, etc.

References:

  • Roger B. Myerson, Game Theory: Analysis of Conflict, Harvard University Press, September 1997.
  • Rakesh V. Vohra: Advanced Mathematical Economics. Routledge, New York, NY, 2005.
  • Andreu Mas-Colell, Michael D. Whinston, and Jerry R. Green: Microeconomic Theory. Oxford University Press, New York, 1995.
  • Current Literature

Prerequisites

  • Elementary knowledge of linear algebra, linear programming, algorithms, and game theory is useful for this course.


Markov decision processes, finite horizon models, infinite horizon models under discounted and long-run average cost criteria, classical solution techniques -- policy iteration, value iteration, problems with perfect and imperfect state information. Reinforcement learning, solution algorithms -- Q-learning, TD(lambda), actor-critic algorithms.

References:

  • D.P.Bertsekas, Dynamic Programming and Optimal Control, Vol.I and II, Athena Scientific, 2005.
  • D.P.Bertsekas and J.N.Tsitsiklis, Neuro-Dynamic Programming, Athena Scientific, 1996.
  • R.S.Sutton and A.G.Barto, Reinforcement Learning: An Introduction, MIT Press, 1998.
  • Selected Research Papers.

Prerequisites

  • A course on probability theory and stochastic processes. Knowledge of nonlinear programming is desirable.


Introduction to Stochastic approximation algorithms, ordinary differential equation based convergence analysis, stability of iterates, multi-timescale stochastic approximation, asynchronous update algorithms, gradient search based techniques, topics in stochastic control, infinite horizon discounted and long run average cost criteria, algorithms for reinforcement learning.
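
As a minimal sketch of a stochastic approximation iteration of the kind analysed in the course (a Robbins-Monro scheme for finding a root of an expectation; the noise model and step sizes are assumptions):

    import random

    def robbins_monro(noisy_g, x0=0.0, n_iters=10000):
        """Iterate x_{n+1} = x_n - a_n * noisy_g(x_n) with diminishing steps a_n = 1/(n+1);
        under standard conditions the iterates converge to a root of g(x) = E[noisy_g(x)]."""
        x = x0
        for n in range(n_iters):
            a_n = 1.0 / (n + 1)
            x -= a_n * noisy_g(x)
        return x

    # Example: g(x) = x - 2 observed with additive zero-mean noise; the root is x = 2.
    est = robbins_monro(lambda x: (x - 2.0) + random.gauss(0.0, 1.0))
    print(est)   # close to 2.0

The ODE method covered in the course interprets such iterates as a noisy discretization of dx/dt = -g(x).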

References:

  • H.J.Kushner and G.Yin, Stochastic approximation and recursive algorithms and applications (2nd edition), Springer Verlag, New York, 2003.
  • A. Benveniste, M. Métivier and P. Priouret, Adaptive Algorithms and Stochastic Approximations, Springer-Verlag, 1990.
  • V.S.Borkar,Stochastic Approximation: A Dynamical Systems Viewpoint, Hindustan Book Agency, 2008.
  • D.P.Bertsekas and J.N.Tsitsiklis, Neuro-dynamic programming, Athena Scientific, 1996.
  • Relevant research papers

Prerequisites

  • A basic course on probability theory and stochastic processes