<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Suresh Venkatasubramanian &#187; CCF 0841185</title>
	<atom:link href="http://www.cs.utah.edu/~suresh/web/tag/sger/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.cs.utah.edu/~suresh/web</link>
	<description></description>
	<lastBuildDate>Tue, 26 Feb 2013 16:47:11 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.5</generator>
		<item>
		<title>Computing Hulls, Centerpoints and VC Dimension in Positive Definite Space</title>
		<link>http://www.cs.utah.edu/~suresh/web/2011/08/10/computing-hulls-in-positive-definite-space/</link>
		<comments>http://www.cs.utah.edu/~suresh/web/2011/08/10/computing-hulls-in-positive-definite-space/#comments</comments>
		<pubDate>Wed, 10 Aug 2011 07:16:01 +0000</pubDate>
		<dc:creator>suresh</dc:creator>
				<category><![CDATA[Papers]]></category>
		<category><![CDATA[CCF 0841185]]></category>

		<guid isPermaLink="false">http://www.cs.utah.edu/~suresh/web/?p=69</guid>
		<description><![CDATA[[author]P. Thomas Fletcher, John Moeller, Jeff Phillips and Suresh Venkatasubramanian[/author] In Algorithms And Data Structures Symposium (formerly WADS), 2011. Abstract: Many data analysis problems in machine learning, shape analysis, information theory and even mechanical engineering involve the study and analysis of collections of positive definite matrices. The space of such matrices P(n) is a Riemannian [...]]]></description>
				<content:encoded><![CDATA[<p>[author]P. Thomas Fletcher, John Moeller, Jeff Phillips and Suresh Venkatasubramanian[/author]<br />
In <a href="http://www.wads.org/">Algorithms And Data Structures Symposium</a> (formerly WADS), 2011.<br />
<span id="more-69"></span><br />
Abstract:</p>
<p>Many data analysis problems in machine learning, shape analysis, information theory and even mechanical engineering involve the study and analysis of collections of positive definite matrices. The space of such matrices P(n) is a Riemannian manifold with variable negative curvature. It includes Euclidean space and hyperbolic space as submanifolds, and poses significant challenges for the design of algorithms for data analysis. </p>
<p>In this paper, we develop foundational geometric structures and algorithms for analyzing collections of such matrices. A key technical contribution of this work is the use of <em>horoballs</em>, a natural generalization of halfspaces for non-positively curved Riemannian manifolds. Horoballs possess some desirable properties of halfspaces (and balls) but are fundamentally more complex to work with because of the inherent curvature of the underlying space. </p>
<p>We propose generalizations of the notion of a convex hull and a centerpoint and develop algorithms for constructing such structures approximately by combining structural properties of horoballs with novel decompositions of P(n). Using these, we also prove that the VC-dimension of range spaces defined by horoballs is bounded in the case of P(2) (2 × 2 symmetric positive definite matrices). </p>
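The geometry driving these constructions is the standard affine-invariant Riemannian metric on P(n). A minimal sketch of computing the geodesic distance under that metric (an illustration of the geometry only, not the paper's hull or centerpoint algorithms; the function name is ours):

```python
import numpy as np
from scipy.linalg import sqrtm, logm

def spd_geodesic_distance(A, B):
    """Affine-invariant Riemannian distance on P(n):
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    A_inv_sqrt = np.linalg.inv(sqrtm(A))
    M = A_inv_sqrt @ B @ A_inv_sqrt
    # Discard numerically spurious imaginary residue from logm.
    return np.linalg.norm(np.real(logm(M)), "fro")
```

This distance is invariant under congruence A ↦ G A Gᵀ for invertible G, which is exactly why flat Euclidean tools such as halfspaces must be replaced by curvature-aware objects like horoballs.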
<p><strong>Links</strong>: </p>
<ul>
<li><a href="http://www.cs.utah.edu/~suresh/web/wp-content/uploads/2009/10/paper.pdf">2 page version</a> at the <a href="http://www.cs.tufts.edu/research/geometry/FWCG09/">19th Fall Workshop on Computational Geometry</a></li>
<li>Original version at the arxiv (<a href="http://arxiv.org/abs/0912.1580">arXiv:0912.1580v2 [cs.CG]</a>)</li>
<li><a href="http://www.cs.utah.edu/~suresh/papers/psd/paper.pdf">Latest version</a> (restructured, including new results on VC-dimension)</li>
</ul>
<hr />
<p>This material is based upon work supported by the National Science Foundation under Grant No. 0841185.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.cs.utah.edu/~suresh/web/2011/08/10/computing-hulls-in-positive-definite-space/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Active Supervised Domain Adaptation</title>
		<link>http://www.cs.utah.edu/~suresh/web/2011/07/29/active-supervised-domain-adaptation/</link>
		<comments>http://www.cs.utah.edu/~suresh/web/2011/07/29/active-supervised-domain-adaptation/#comments</comments>
		<pubDate>Fri, 29 Jul 2011 22:11:09 +0000</pubDate>
		<dc:creator>suresh</dc:creator>
				<category><![CDATA[Papers]]></category>
		<category><![CDATA[CCF 0841185]]></category>
		<category><![CDATA[CCF 0953066]]></category>

		<guid isPermaLink="false">http://www.cs.utah.edu/~suresh/web/?p=239</guid>
		<description><![CDATA[[author]Avishek Saha, Piyush Rai, Hal Daumé III, Suresh Venkatasubramanian, and Scott L. DuVall[/author] In the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD 2011) Abstract: In this paper, we harness the synergy between two important learning paradigms, namely, active learning and domain adaptation. We show how active learning [...]]]></description>
				<content:encoded><![CDATA[<p>[author]Avishek Saha, Piyush Rai, Hal Daumé III, Suresh Venkatasubramanian, and Scott L. DuVall[/author]<br />
<em>In the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (<a href="http://www.ecmlpkdd2011.org/index.php">ECML-PKDD 2011</a>)</em><br />
<span id="more-239"></span><br />
<strong>Abstract:</strong><br />
In this paper, we harness the synergy between two important learning paradigms, namely, active learning and domain adaptation. We show how active learning in a target domain can leverage information from a different but related source domain. Our proposed framework, Active Learning Domain Adapted (Alda), uses source domain knowledge to transfer information that facilitates active learning in the target domain. We propose two variants of Alda: a batch B-Alda and an online O-Alda. Empirical comparisons with numerous baselines on real-world datasets establish the efficacy of the proposed methods.</p>
<p>Links: <a href="http://www.cs.utah.edu/~suresh/papers/ecml2011/alda.pdf">PDF</a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.cs.utah.edu/~suresh/web/2011/07/29/active-supervised-domain-adaptation/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Johnson-Lindenstrauss Dimensionality Reduction on the Simplex</title>
		<link>http://www.cs.utah.edu/~suresh/web/2010/10/15/johnson-lindenstrauss-dimensionality-reduction-on-the-simplex/</link>
		<comments>http://www.cs.utah.edu/~suresh/web/2010/10/15/johnson-lindenstrauss-dimensionality-reduction-on-the-simplex/#comments</comments>
		<pubDate>Fri, 15 Oct 2010 07:33:42 +0000</pubDate>
		<dc:creator>suresh</dc:creator>
				<category><![CDATA[Papers]]></category>
		<category><![CDATA[CCF 0841185]]></category>
		<category><![CDATA[CCF 0953066]]></category>

		<guid isPermaLink="false">http://www.cs.utah.edu/~suresh/web/?p=217</guid>
		<description><![CDATA[[author]Rasmus J. Kyng, Jeff M. Phillips and Suresh Venkatasubramanian[/author] In the 20th Fall Workshop on Computational Geometry, 2010. We propose an algorithm for dimensionality reduction on the simplex, mapping a set of high-dimensional distributions to a space of lower-dimensional distributions, whilst approximately preserving pairwise Hellinger distance between distributions. By introducing a restriction on the input [...]]]></description>
				<content:encoded><![CDATA[<p>[author]Rasmus J. Kyng, Jeff M. Phillips and Suresh Venkatasubramanian[/author]<br />
In the <a href="http://www.ams.sunysb.edu/~jsbm/fwcg-2010.html">20th Fall Workshop on Computational Geometry</a>, 2010.</p>
<p><span id="more-217"></span><br />
We propose an algorithm for dimensionality reduction on the simplex, mapping a set of high-dimensional distributions to a space of lower-dimensional distributions, whilst approximately preserving pairwise Hellinger distance between distributions. By restricting the input to distributions that are sufficiently smooth, we can map <em>n</em> points on the <em>d</em>-simplex to the simplex of O(ε<sup>-2</sup> log <em>n</em>) dimensions with ε-distortion with high probability. The techniques used rely on a classical result by Johnson and Lindenstrauss on dimensionality reduction for Euclidean point sets and require the same number of random bits as non-sparse methods proposed by Achlioptas for database-friendly dimensionality reduction.</p>
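The connection to Johnson-Lindenstrauss comes from the square-root map: the Hellinger distance between two distributions is, up to a factor of √2, the Euclidean distance between their elementwise square roots, so a standard Gaussian projection of the square-root vectors approximately preserves pairwise Hellinger distances. A minimal sketch of that observation (not the paper's full construction, which additionally maps the projected points back onto a lower-dimensional simplex; function names are ours):

```python
import numpy as np

def hellinger(p, q):
    # Hellinger distance: (1/sqrt(2)) * || sqrt(p) - sqrt(q) ||_2
    return np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2.0)

def jl_project_sqrt(points, k, rng):
    """Map each distribution (a row of `points`) to a k-dimensional
    vector via a Gaussian JL projection of its square root; Euclidean
    distances in the image approximate sqrt(2) * Hellinger distances
    with high probability."""
    d = points.shape[1]
    R = rng.standard_normal((d, k)) / np.sqrt(k)
    return np.sqrt(points) @ R
```

The square-root map does not itself land on the simplex, which is why an extra step is needed to produce genuine lower-dimensional distributions.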
<p>Links: <a href="http://www.cs.utah.edu/~suresh/papers/jlsimplex/fwcg10.pdf">PDF</a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.cs.utah.edu/~suresh/web/2010/10/15/johnson-lindenstrauss-dimensionality-reduction-on-the-simplex/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>