Somehow I made it out alive after that deadline. I may have knocked five years off my overall lifespan, but the paper has officially been submitted. Dr. Rosen was a real trooper this week: in this past week alone, I probably sent him half as many emails as I have over our entire collaboration. We’re both super happy with the way our paper came out, and it’d be awesome to have this paper finished together before I graduate from USF. This week we’re meeting to talk grad school and figure out the next steps with our 1-dimensional tool. I leave in late May/early June for my internship at NREL, so I’m going to try my best[…]
This week was super stressful. Our Vis deadline is 3/31 and I’ve pulled three all-nighters in one week. I remember when I was a first-year music student back at FSU, I used to pull all-nighters like it was NOTHING! Our paper is really coming together and I have to say, I totally love it! I’m really happy about the progress we’ve made and I really hope it gets into Vis this year. Germany would be awesome to visit! I’m finishing the results section this week and also cleaning up the code for submission. We’re redoing our video presentation and packaging up our software so reviewers can play with the graphs themselves. I’ve learned so much about[…]
In addition to the synthetic datasets we’ve generated, another aspect of my job these next few weeks is finding more real-world graphs to run our tool on. I’ve found some really neat ones, and I’ll link the websites at the bottom of my blog post. Notably, our tool works really well on collaboration networks and social networks. It’s really interesting to see these dominant structures form using PH on graphs, and the more I learn about working with graphs, the more I want to continue doing so through my PhD. Our Vis deadline is March 31st, so there’s a lot to do!

weighted, undirected networks: https://toreopsahl.com/datasets/
network data: http://www-personal.umich.edu/~mejn/netdata/
synthetic datasets: https://sparse.tamu.edu/
So we decided to resubmit our 0-dimensional PH FDL paper to Vis ’18. One of the biggest critiques from our EuroVis reviewers was that we didn’t test enough datasets (mainly, larger ones). This week we’ve been focusing on finding large datasets, as well as synthetically generated ones. Mustafa kindly generated about 20 of them (wow!) for us, so I’ve been testing them and seeing what kinds of layouts and structures our tool produces. So far, the results have been great! We’re going to keep expanding our results section so we hopefully have something more compelling to show the reviewers this round.
Part of what we’re reworking for the 0d project is adding more datasets. These need to include graphs with 5,000-10,000 nodes, a mix of synthetic and real-world data. This week I’m finding large graphs and cleaning up the code for our project. I’m also going to implement some other methods to compare against ours; hopefully we’ll end up with a table showing each method’s graph layouts side by side. We’re aiming for the Vis ’18 deadline, which only gives us 3.5 weeks, so wish us luck!
This week we’ve been revisiting the 0-dimensional PH project to submit to either Vis ’18 or TVCG. Part of this has been optimizing how the force-directed layout runs: we reduced its per-iteration cost from O(n^2) to O(n log n). We’ve also been cleaning up the code to include as part of the submission. I’m presenting our paper to the data visualization class, so I’ve been taking new pictures for the project and updating my slides. This past week I visited the University of British Columbia! It was my first time in Canada and it was great. In two weeks I’ll be visiting UMD College Park, and then I’ll finally be done with all my visits. The Vis ’18 deadline is March 31st,[…]
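The post doesn’t name the technique behind the O(n^2) → O(n log n) speedup, but the standard way to get there for a force-directed layout is Barnes–Hut approximation: store the nodes in a quadtree and treat far-away groups of nodes as a single pseudo-node when computing repulsive forces. The sketch below is a minimal, hypothetical illustration of that idea (the class and function names are mine, not our tool’s), assuming distinct node positions in the unit square:

```python
class Quad:
    """A square cell [x, x+s] x [y, y+s] of a Barnes-Hut quadtree.
    Assumes all inserted points have distinct positions."""
    def __init__(self, x, y, s):
        self.x, self.y, self.s = x, y, s
        self.mass = 0.0          # number of nodes in this cell
        self.cx = self.cy = 0.0  # center of mass of those nodes
        self.point = None        # the single node, if this is a leaf
        self.children = None     # four sub-quads once subdivided

    def insert(self, px, py):
        # fold the new point into this cell's center of mass
        self.cx = (self.cx * self.mass + px) / (self.mass + 1)
        self.cy = (self.cy * self.mass + py) / (self.mass + 1)
        self.mass += 1
        if self.children is None:
            if self.point is None and self.mass == 1:
                self.point = (px, py)   # empty leaf: just store the point
                return
            # occupied leaf: subdivide and push the old point down
            h = self.s / 2
            self.children = [Quad(self.x, self.y, h),     Quad(self.x + h, self.y, h),
                             Quad(self.x, self.y + h, h), Quad(self.x + h, self.y + h, h)]
            old, self.point = self.point, None
            self._child_for(*old).insert(*old)
        self._child_for(px, py).insert(px, py)

    def _child_for(self, px, py):
        h = self.s / 2
        return self.children[(px >= self.x + h) + 2 * (py >= self.y + h)]

def repulsion(px, py, quad, theta=0.5, k=0.01):
    """Approximate total repulsive force on (px, py) from every node in `quad`.
    A cell far enough away (s/d < theta) acts as one pseudo-node, so each
    query touches O(log n) cells instead of all n nodes."""
    if quad.mass == 0:
        return (0.0, 0.0)
    dx, dy = px - quad.cx, py - quad.cy
    d2 = dx * dx + dy * dy
    if d2 == 0:
        return (0.0, 0.0)  # skip self-interaction / coincident center
    if quad.children is None or quad.s * quad.s < theta * theta * d2:
        f = k * quad.mass / d2
        return (f * dx, f * dy)
    fx = fy = 0.0
    for child in quad.children:
        cfx, cfy = repulsion(px, py, child, theta, k)
        fx += cfx
        fy += cfy
    return (fx, fy)
```

Attractive forces along edges are already linear in the number of edges, so the quadtree on the repulsive side is what brings the whole iteration down to O(n log n).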
I think there’s a previous blog post with the same title. This past week I’ve worked on adding some extra features to the random walk algorithm, which will choose edges with a certain probability instead of uniformly at random. I’m also abstracting the data handling to take in any val file and calculate the cycles from there. I found out this past week that our paper didn’t get accepted to EuroVis, so now we’ll either submit to TVCG or fold that work into our current work as one paper for Vis. I’m meeting with Dr. Rosen tomorrow to figure this next step out together. I finished visiting Tufts this past weekend and it was amazing! This week I visit UBC and[…]
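The post doesn’t say how those edge probabilities are assigned; one natural choice is to make each edge’s probability proportional to its weight. As a hedged sketch (the function name and the `(node, weight)` neighbor format are my illustration, not our actual code), a single weighted step of the walk could look like:

```python
import random

def weighted_step(neighbors, rng=random):
    """Pick the walk's next node, choosing each incident edge with
    probability proportional to its weight rather than uniformly.
    `neighbors` is a list of (node, weight) pairs with positive weights."""
    total = sum(w for _, w in neighbors)
    r = rng.uniform(0.0, total)
    acc = 0.0
    for node, w in neighbors:
        acc += w
        if r <= acc:
            return node
    return neighbors[-1][0]  # guard against floating-point rounding
```

Setting every weight to 1 recovers the original uniform behavior, which makes the change easy to toggle while testing.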
I finally finished coding the random walk algorithm with all the additional features we’ve discussed (so far!). I found out this past week that I’ve been accepted to quite a few PhD programs, as well as an internship at the National Renewable Energy Laboratory with Dr. Kristi Potter! These next few weeks will be tough: I have back-to-back grad visits over the next month and we’re still planning for the March 31st Vis deadline. I find out on the 21st whether our EuroVis paper was accepted, and depending on that we’ll figure out how to approach this next paper. Looking forward to this next month! An example of our random walk algorithm with 500 random walks completed. Node size reflects the[…]
This week we finished coding an additional method for identifying cycles. This method was suggested during a meeting I was lucky enough to attend at last year’s Vis conference. Carlos (one of our collaborators) suggested we use triangles in the graph to find additional walks, which we’ve now implemented using JavaPlex. My method is mostly finished as well; I’m now adding the ability to run many walks at a time, where a walk ends if its current node has already been visited. This has been a bit difficult because I cannot let the nodes between the starting point and the revisited point count towards the overall probability. Next week we finish both of our methods and look to combine them. Featured image[…]
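The walk-until-revisit idea above can be sketched in a few lines. This is a hypothetical illustration, not the paper’s implementation: when a walk reaches a node it has already seen, only the loop from that node’s first occurrence onward is returned as the cycle, and the tail of nodes walked before entering the loop is discarded (so they don’t count toward the cycle itself). Here `adj` is assumed to map each node to a list of neighbors:

```python
import random

def walk_until_revisit(adj, start, rng=random, max_steps=10000):
    """Random-walk from `start` until some node repeats, then return the
    closed cycle: the portion of the path from the repeated node's first
    occurrence onward. Nodes walked before entering the cycle are dropped."""
    path = [start]
    seen = {start: 0}  # node -> index of its first occurrence in path
    for _ in range(max_steps):
        nxt = rng.choice(adj[path[-1]])
        if nxt in seen:
            return path[seen[nxt]:] + [nxt]  # close the loop, drop the tail
        seen[nxt] = len(path)
        path.append(nxt)
    return None  # no revisit within max_steps
```

On a plain triangle the walk returns the triangle itself; on a “lollipop” (a stem leading into a loop) the stem is correctly excluded from the returned cycle.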
This past week I’ve worked on fixing the random walk algorithm. Right now the algorithm can run clockwise as long as the starting node is one of the 10 lowest-ranked nodes; this week I’m working on making any node able to be the starting node. I’m also going to test new datasets when possible. So far we’ve only tested single cycles, but I already have two additional datasets, one with two cycles and one with three. Hopefully next week I’ll be able to incorporate those as well.