20/11/2010 Saturday
21/11/2010 Sunday
24/11/2010 8hrs scraping tried to get a response from a page without hardcoding values from httpfox, and studied web services
27/11/2010 Saturday
28/11/2010 Sunday

Month December
Date No.Of.Hrs Subject Description
29/11/2010 8hrs scraping analysed indian airlines. Got the output of via, used httpfox version 5, and analysed the raw responses for errors like "operation timeout error".
9/12/2010 8hrs scraping used a session object and prevented logging in every time while running the code.
11/12/2010 Saturday
12/12/2010 Sunday
19/12/2010 Sunday
25/12/2010 Saturday
26/12/2010 Sunday
8/1/2011 Saturday
9/1/2011 Sunday
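The session-object approach in the entries above, logging in once and reusing the session instead of logging in on every run, can be sketched in Python. The log's actual code was in dotnet; this stdlib sketch only shows the idea, and the login URL and form fields are left as hypothetical comments:

```python
import http.cookiejar
import urllib.request

def build_session():
    """Create one opener whose cookie jar persists across requests,
    so a login performed once is reused for later page fetches."""
    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    # Send a browser-like User-Agent with every request made via this opener.
    opener.addheaders = [("User-Agent", "Mozilla/5.0")]
    return opener, jar

# Usage (network calls left out; URL and form fields are hypothetical):
# opener, jar = build_session()
# opener.open("https://example.com/login", data=b"user=u&pass=p")
# opener.open("https://example.com/results")  # same cookies, no re-login
```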
scraped via as: 1. write the sessionid from the login page into a mysql table; 2. retrieve time_insert from this table and calculate the time difference; 3. according to the time difference, the session id from the login page is taken; 4. got to know the usage of datediff and the abstraction class, and the simultaneous usage of a mysql table and a dotnet application.
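A minimal sketch of the steps in that entry: store the sessionid with its insert time, compute the time difference, and reuse the id only while it is still fresh. The log used mysql's datediff from a dotnet application; here sqlite stands in, and the table layout and the 15-minute cutoff are assumptions:

```python
import sqlite3
import time

TTL_SECONDS = 15 * 60  # assumed session lifetime; the real cutoff isn't in the log

def save_session(db, session_id):
    """Write the sessionid from the login page with its insert time."""
    db.execute("CREATE TABLE IF NOT EXISTS sessions (sid TEXT, time_insert REAL)")
    db.execute("INSERT INTO sessions VALUES (?, ?)", (session_id, time.time()))

def current_session(db):
    """Return the newest stored sessionid if it is still fresh, else None
    (meaning the caller must hit the login page for a new one)."""
    row = db.execute(
        "SELECT sid, time_insert FROM sessions ORDER BY time_insert DESC LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    sid, time_insert = row
    # The datediff-style check: reuse only a young-enough session.
    if time.time() - time_insert < TTL_SECONDS:
        return sid
    return None
```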
scraped via using the subdate() function in mysql, and tried to store the mysql connection string in the web.config file
scraped springtravels and analysed the json array response
scraped springtravels and tried to pick individual data out of the response
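Picking individual data out of a json array response, as in the springtravels entries, might look like this; the response shape and field names below are invented for illustration, not springtravels' real format:

```python
import json

# A stand-in for a scraped json array response (hypothetical fields).
RAW = '[{"flight": "SG-101", "fare": 3450}, {"flight": "SG-204", "fare": 2999}]'

def cheapest_flight(raw):
    """Parse the json array and pick individual fields from each element."""
    flights = json.loads(raw)
    best = min(flights, key=lambda f: f["fare"])
    return best["flight"], best["fare"]
```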
prepared documentation of the scrape details so far, and scraped spring travels
Description
scraped spring travels: changed the response to be text of some format instead of xml, and tried …
Learned how to use mysqldump. Tried via using different user agents for the same request. Tried …
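Trying different user agents for the same request, as noted above, amounts to rotating the User-Agent header across attempts; a small sketch, with example header strings:

```python
import itertools

# Example user agent strings; any list of real browser strings would do.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 6.1) Firefox/3.6",
    "Mozilla/5.0 (Windows NT 6.1) Chrome/8.0",
    "Opera/9.80 (Windows NT 6.1)",
]

_ua_cycle = itertools.cycle(USER_AGENTS)

def next_headers():
    """Each call returns headers with the next user agent in the rotation,
    so repeated requests to the same page don't all look identical."""
    return {"User-Agent": next(_ua_cycle)}
```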
Stored the jetairways flight schedule excel in the Mysql database, and tried to store a table in the cache in the dotnet application
I have taken a mysql dump of the flights_shedules of these websites: 1. flykingfisher
2. goair airlines 3. indian airlines 4. indigo 5. jetairways 6. jetlite 7. spicejet
tried to convert all the schedule tables into one table and combined the schedules for jetairways, jetlite and goair
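Combining the per-airline schedule tables into one is essentially a union with a source column. A sketch of the idea in sqlite (the log used mysql, and the table and column names here are assumed):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE jetairways_shedule (flight TEXT, origin TEXT, destination TEXT);
CREATE TABLE jetlite_shedule    (flight TEXT, origin TEXT, destination TEXT);
INSERT INTO jetairways_shedule VALUES ('9W-301', 'MAA', 'DEL');
INSERT INTO jetlite_shedule    VALUES ('S2-118', 'BOM', 'MAA');

-- One combined table, tagging each row with the airline it came from.
CREATE TABLE all_shedules AS
  SELECT 'jetairways' AS airline, * FROM jetairways_shedule
  UNION ALL
  SELECT 'jetlite' AS airline, * FROM jetlite_shedule;
""")
rows = db.execute(
    "SELECT airline, flight FROM all_shedules ORDER BY airline"
).fetchall()
```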
scraped makemytrip.com and understood how to log in through putty and get to the mysql command line there
used xmlnode, xmldocument and xmlnodelist in the code. Fetched data from the neptune database for the queries of harihar.
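The xmlnode, xmldocument and xmlnodelist work was in dotnet; the same pattern of loading a document, selecting a node list and reading each node looks like this with Python's stdlib ElementTree (the document shape is invented):

```python
import xml.etree.ElementTree as ET

# A stand-in for a scraped xml response (hypothetical shape).
RAW = """<flights>
  <flight no="AI-101"><origin>MAA</origin></flight>
  <flight no="AI-202"><origin>DEL</origin></flight>
</flights>"""

def flight_numbers(raw):
    root = ET.fromstring(raw)       # like loading an XmlDocument
    nodes = root.findall("flight")  # like an XmlNodeList from SelectNodes
    return [node.get("no") for node in nodes]
```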
scraped amadeus.net by creating all the origin-destination combinations in excel, included the excel sheet in the code, and wrote the results into the mysql database.
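Generating every origin-destination combination, as done above in excel for amadeus.net, can also be done in code; a sketch with itertools.product, using example airport codes:

```python
import itertools

AIRPORTS = ["MAA", "DEL", "BOM"]  # example codes, not the actual list used

def od_pairs(airports):
    """All ordered origin-destination combinations, skipping origin == destination."""
    return [(o, d) for o, d in itertools.product(airports, repeat=2) if o != d]
```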
took neptune reports like fund transversals, fund reversals, credit notes and debit notes
… Parsed the xml document instead of the html document.