CV - Work experience

Latest revision as of 15:01, 14 June 2023





Present

2022

IDG Direct, Ireland

Real Time Data Analyst

I'm responsible for analyzing and monitoring call center data. This includes call volumes, performance indicators, queue times, agent availability, inactivity levels, average handle time, etc.

  • Python programming for data analysis and data visualization: Pandas, Scikit-learn, Plotly, Dash (see the sketch after this list).
  • Data preparation, data visualization, and dashboard/report creation with SiSense.
  • Finding patterns and trends in the data to help increase productivity and forecast requirements.
  • Produce daily, weekly, and monthly internal reports to assist with the creation of metrics and targets for services.
  • Data Management in Excel.
  • Generate ideas for process and service improvement.
  • Work closely with the operations team to analyze and help improve their delivery processes.
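
Below is a minimal sketch of the kind of Pandas/Plotly analysis described in the list above. It is illustrative only: the file name and the column names ("timestamp", "agent", "handle_time_s", "queue_time_s") are hypothetical, not the actual call center data.

```python
# Illustrative sketch: aggregate daily call center KPIs with Pandas and chart
# them with Plotly. File name and column names are hypothetical.
import pandas as pd
import plotly.express as px

calls = pd.read_csv("calls.csv", parse_dates=["timestamp"])  # hypothetical export
calls["date"] = calls["timestamp"].dt.date

# Daily call volume, average handle time and average queue time.
daily = calls.groupby("date").agg(
    call_volume=("agent", "count"),
    avg_handle_time_s=("handle_time_s", "mean"),
    avg_queue_time_s=("queue_time_s", "mean"),
).reset_index()

fig = px.line(daily, x="date", y="avg_handle_time_s",
              title="Average handle time per day")
fig.show()
```

Similar aggregates could feed the daily and weekly reports and the SiSense dashboards mentioned above.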

Present

2020

Python Developer - Freelance

  • Development of crypto trading applications/bots using Python: historical and real-time analysis of cryptocurrency data to forecast price movements. The data analyzed include prices, volumes, cryptocurrency news (tweets, announcements of new cryptocurrency listings on exchanges), etc. A minimal asyncio/aiohttp sketch follows this list.
  • Real-time response optimization of crypto news (tweets/web content) through a multi-location server architecture.
  • Centralized management of multiple social media accounts to monitor crypto news/market sentiment.
  • REST API requests (Requests, asyncio/aiohttp). WebSockets client and server.
  • Web Scraping, BeautifulSoup, Selenium, Mysql.connector.
  • Concurrency: Multithreading (threading, ThreadPoolExecutor), Multiprocessing, Event loop (asyncio).
  • AWS (EC2, VPC, Amazon S3, Amazon Glacier) / GoogleCloud
  • Data analysis and visualization: Pandas, Scikit-learn, Plotly, Dash.
  • Development of an eCommerce Web Application for an optical glasses retailer using Python-Django:
Visit the Web App at http://www.vglens.sinfronteras.ws
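
The REST/concurrency items above can be illustrated with a minimal asyncio/aiohttp sketch. It is a simplified illustration, not the actual bot: the URLs below are placeholders, not real exchange endpoints.

```python
# Illustrative sketch: fetch several ticker endpoints concurrently with
# asyncio/aiohttp. The URLs are placeholders, not the project's endpoints.
import asyncio
import aiohttp

URLS = [
    "https://example.com/api/ticker/BTC-USD",   # placeholder endpoint
    "https://example.com/api/ticker/ETH-USD",   # placeholder endpoint
]

async def fetch_json(session: aiohttp.ClientSession, url: str) -> dict:
    async with session.get(url, timeout=aiohttp.ClientTimeout(total=10)) as resp:
        resp.raise_for_status()
        return await resp.json()

async def main() -> None:
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(fetch_json(session, u) for u in URLS),
                                       return_exceptions=True)
        for url, result in zip(URLS, results):
            print(url, "->", result)

if __name__ == "__main__":
    asyncio.run(main())
```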

2022

2017

IDG Direct, Ireland

Senior Business Development Executive

I am responsible for creating new sales prospects and carrying out B2B outgoing calls to prospective clients. Additionally, I prioritize staying well-informed about each customer's current and future business practices and processes, allowing me to build and maintain strong relationships with them.

Building and maintaining a professional relationship with IT Managers is a key part of my role. This enables me to identify their needs and next investments. The gathered information is required by our clients (Largest Tech Companies) and used in the next step of the sales process.

  • I work in different markets and contact clients in French, English, and Spanish: France, Belgium, Luxembourg, Spain, the Middle East, and Africa.
  • Gathering client details and maintaining/updating the IDG database with accurate client details.
  • Train new team members and provide ongoing training:
  • Motivating, developing, and training staff at a BANT level.
  • Script delivery and how to effectively communicate.
  • How to construct specific campaign questions that encourage the prospect to expand on information.
  • Monitoring team members on a daily basis, providing support and encouragement where necessary to ensure all service levels and KPIs are reached.
  • Ensure Key Performance Indicators (KPIs) are met and constantly improved.
  • Collaborating with the Team Manager in the recruitment process of new agents.
  • I have experience leading campaigns, liaising and working closely with the Client Manager, Team Leaders, and all key stakeholders involved in the team set-up.
  • Identifying any business need or potential investment a prospect may have that our clients can potentially capitalize on.



Communication and Sales Skills

  • My current job at IDG is built around communication. First, I work in a team and we have to reach our targets as a team, so communication within the team is key. Second, one of my main responsibilities is to call contacts (IT Managers) on behalf of our clients, and that is all about clear and effective communication: I have to explain the reason for the call and the topic of the campaign, and, most importantly, create an atmosphere in which the contact feels comfortable and is willing to answer my questions.
  • In this position, I have improved my communication skills in French and English. I have learned how to build and maintain professional relationships and improved my active listening skills.
  • I also think I have developed communication skills not only at work but also in other areas of my life. I have always played team sports at a highly competitive level: volleyball when I was a child (I was a member of my state team and attended one national games) and water polo at university, where I attended five National University Games. These are activities in which you develop, sometimes without being aware of it, many communication skills.
  • I have to call IT Managers and establish and maintain a professional conversation with them in order to identify their next investments. The information we gather about those investments is required by our clients (IT companies: IBM, DELL, NetApp, etc.), who use it in the next step of the sales process.
  • Let's say that IBM is looking to sell a particular product (a cloud backup solution, for example). IBM then requires IDG's services, asking for a number of contacts (IT Managers) that are planning to invest in backup solutions. We establish a professional conversation with IT Managers from our database and identify those that are looking to invest in the product required by the client.
  • During the phone conversations, I have to explain the topic of the product that our clients are looking to sell and be able to handle objections. That is why this experience has made me aware of the latest solutions and technologies that the most important IT companies are working on.
  • At IDG, I have also completed a Certified Sales training. During this course, I learned and put into practice the most important concepts of the sales process:
  • Prospecting, Preparation, Approach, Presentation, Handling objections, Closing, Follow-up
https://www.lucidchart.com/blog/what-is-the-7-step-sales-process


Target and KPI

  • I'm used to working in a Target Working Environment (TWE) because I'm currently working in one at IDG.
  • At IDG we have to reach a daily target of about €650.
  • To reach this target we need to generate what we call a «lead». A lead is a conversation that matches the criteria asked for by the client. For example, if the client (say, IBM) is asking for contacts that are looking to invest in backup solutions, then every conversation in which the contact confirms they are looking at backup solutions represents a «lead».
  • Each lead that we generate has a price, and we need to generate as many leads as needed to reach the €650 target. An easy lead is normally worth about €65 and a complicated one about €180, so a typical day means roughly ten easy leads or about four hard ones.
  • So every day we have to fight to reach the target, and we usually face several challenges:
  • Data challenges: We make calls using data that has been prepared for a particular campaign. Often you can make many calls without reaching the contacts you are looking for, so you can spend the day on the phone without actually having conversations with IT Managers. If you are not reaching the contact, you cannot generate leads.
  • Hard campaign challenges: The client is asking for a difficult criterion. For example, the client may want contacts that are looking to invest in one specific solution (SAP applications, for instance), which is challenging because we have to reach a contact who is planning to invest in exactly that solution.
  • Solutions: There are a few techniques we apply when we face these challenges. Changing the data or the campaign you are working on is the first option, but sometimes you cannot change the campaign because we have to deliver the number of leads the client is asking for. We usually make calls through a platform that dials automatically, taking contacts from the database linked to the campaign, so normally we don't need to worry about the criteria (company size, job title, industry) of the contacts we are calling. When there are data problems, the solution is to research contacts manually. That is a little tricky: you can try to find the best contacts by searching the database by hand, but the research takes time and doesn't guarantee that you will reach the contact and get leads. So when the data is good you use the platform; otherwise you search for contacts manually, and this manual research is where you have to propose ideas and develop a good methodology to find good contacts and get leads. One technique for hard campaigns is, for example, when we get a lead from a particular company, to call other contacts from the same company, because we know that this company is reviewing the product the client is interested in.
The other approach is to search for new contacts on the internet (usually on LinkedIn), which is even trickier because it is hard to reach a new contact and get the lead. This is where I made a useful contribution. The problem with this external research is that most of the contacts you find on LinkedIn are already in our database, so it usually doesn't add much. But I realized that when we are looking for business job titles (sometimes a client asks for business titles), external research on LinkedIn does make sense, because our database is composed mostly of IT professionals (we have some business contacts, but not many), so the chance of finding a contact on LinkedIn who is not already in our database increases a lot. By doing that, I was able to get a good number of leads for hard campaigns, which is a concrete contribution I made to my team.

2014

WikiVox, France

Web Programmer

I was responsible for the installation and administration of a Wiki Web Application based on the MediaWiki engine.

  • Extensive experience with the MediaWiki Engine:
  • Configuration of a Multilingual Wiki.
  • User access levels configuration.
  • Implementation of different CAPTCHA methods.
  • Implementation of a payment gateway.
  • Page categorization.
  • Take a look at my personal Wiki: http://wiki.sinfronteras.ws
  • Administration of a Linux Server:
  • Installation and configuration of a LAMP stack: Apache, MySQL, PHP.
  • Database management:
  • MySQL, PhpMyAdmin.


WikiVox is a nonprofit organization whose goal is to create a website (a wiki) for debates on political, economic, and environmental topics. They want to create a discussion method capable of generating, at some point in the debate, an article with precise suggestions, in order to contribute to solving the problem.

When I was working at WikiVox, the project was just starting. The philosophy of the project was already mature, but the implementation of the Wiki was just in its first phase.

It was a very nice experience, and I especially liked the philosophy of the project.

I also think that working in a small organization was positive at this point in my career, because I had responsibilities that I am sure I would not have had in a big company, and that is why I learned a lot from this role.

I had responsibilities related to (1) the administration of a Linux Web Server and (2) the design of the website.

  • Regarding Linux administration, my responsibilities concerned the installation and administration of a LAMP stack (Apache, MySQL, PHP) on a Linux server.
  • Regarding the design of the website, we used free software (the Wikipedia software). I was responsible for the installation and administration of a Wiki Web Application based on the MediaWiki engine. Some of the functionalities we implemented:
  • We had to install a LanguageSelector and translate the content into 5 languages: French, English, Spanish, German and Arabic.
  • We had to install an extension to accept donations (online payments): a payment gateway implementing a donation service.
  • An extension to categorize pages.
  • I also had to program in PHP.


Wiki - Organize information into a cohesive, searchable and maintainable system.

  • One of the most important skills I have, whose value I usually find hard to convey, is my wiki management skills.
  • A Wiki is a website on which users can collaborate by creating and modifying content from the web browser. The best example is Wikipedia: someone can create an article and other users can then modify it online. A Wiki is an outstanding tool to organize information into a cohesive, searchable and maintainable system that can be accessed and modified online. The benefits of a wiki for organizing information are remarkable.
I have a personal Wiki (based on the MediaWiki engine) where I document everything I am learning and working on. I use it as a personal knowledge management system that allows me to organize information into a cohesive, searchable and maintainable system. The benefits I have had from using a Wiki are considerable: it has allowed me to learn more effectively and, most importantly, to constantly review and improve on important topics by providing very convenient online access (from anywhere) to organized and structured information.
Take a look at some of my Wiki pages: http://perso.sinfronteras.ws/index.php/Computer_Science_and_IT

2012

2011

Simón Bolívar University - Funindes USB, Venezuela

Research geophysicist

Click here to see some examples of my work in Seismic modelling.

As a Research Geophysicist, I was responsible for performing a set of signal analysis and data processing tasks and for ensuring the correct integration and implementation of geophysical applications on a computer cluster platform.

  • My responsibilities included:
  • Machine Learning algorithms (Regression, classification) for Seismic/Borehole data Analysis.
  • Python / MATLAB / Shell script programming for Seismic data analysis/Signal analysis (Seismic data processing and modeling).
  • Simulations of seismic wave propagation: wavefront and ray tracing.
  • Generation of pre-stack synthetic seismic data using wave propagation theories (ray tracing and finite-difference methods).
  • 2D/3D Seismic data processing:
  • Deconvolution
  • Auto-correlation, Cross-correlation
  • Analysis of signal noise reduction: time/frequency domain transforms
  • Task automation using Shell scripting.


Task automation using Shell scripting: for example, the generation of image sequences used to create seismic wave propagation videos, or the automatic generation of PDF reports with LaTeX containing details about the executed process (run time versus the features and amount of data generated). A minimal Python sketch of the filtering and correlation steps listed above follows.
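
The sketch below is illustrative only (the original work used MATLAB/Scilab and shell scripts, and these parameters are assumptions): it band-pass filters a synthetic trace and estimates the arrival time of a known wavelet by cross-correlation.

```python
# Illustrative sketch: band-pass filtering and cross-correlation of a synthetic
# seismic trace with NumPy/SciPy (parameters are assumptions, not project values).
import numpy as np
from scipy.signal import butter, sosfiltfilt, correlate, correlation_lags

dt = 0.002                                   # 2 ms sample interval (assumed)
t = np.arange(0, 2.0, dt)                    # 2 s trace

# Short 30 Hz Ricker wavelet used as the known source signature.
f0 = 30.0
tw = np.arange(-0.1, 0.1, dt)
wavelet = (1 - 2 * (np.pi * f0 * tw) ** 2) * np.exp(-(np.pi * f0 * tw) ** 2)

# Synthetic trace: the wavelet buried at 0.5 s plus random noise.
trace = np.zeros_like(t)
i0 = int(0.5 / dt) - tw.size // 2
trace[i0:i0 + tw.size] += wavelet
trace += 0.2 * np.random.default_rng(0).standard_normal(t.size)

# Zero-phase band-pass filter, 10-60 Hz.
sos = butter(4, [10, 60], btype="bandpass", fs=1 / dt, output="sos")
filtered = sosfiltfilt(sos, trace)

# Cross-correlate with the wavelet to estimate where it sits in the trace.
xcorr = correlate(filtered, wavelet, mode="full")
lags = correlation_lags(filtered.size, wavelet.size, mode="full")
arrival = (lags[np.argmax(xcorr)] + tw.size // 2) * dt
print(f"Estimated wavelet arrival: {arrival:.3f} s")     # ~0.5 s
```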


I have skills in Matlab, Scilab, Python and Shell scripting that I got during my participation in an R&D Unit at Simón Bolívar University (The Parallel and Distributed Systems Group - GryDs).

MATLAB (matrix laboratory) is a language and numerical computing environment. MATLAB allows data analysis and data visualization, matrix manipulation, and numerical computation, and it contains a huge library of functions that facilitate the resolution of many mathematical and engineering problems. For example, I used it for signal analysis, specifically for seismic data analysis:

  • Signal Processing in Geophysics
  • Ex.1: A program that defines the coordinates of the layers of a geological model. It opens an image file of the model and lets the user select, by clicking with the mouse, a set of points (coordinates) that define each layer. These coordinates are saved in a specific format and used as input by another program, which builds the geological-model entity used for seismic wave propagation modelling. A rough Python equivalent is sketched below.
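
The following sketch uses matplotlib's interactive ginput to pick points on an image and NumPy to save them. It is only an approximation of Ex. 1: the original program was written in MATLAB, the file names here are placeholders, and the real output format was program-specific.

```python
# Rough Python equivalent of Ex. 1 (the original was MATLAB): pick layer
# coordinates on an image of a geological model by clicking, then save them.
# File names are placeholders.
import numpy as np
import matplotlib.pyplot as plt

img = plt.imread("geological_model.png")        # placeholder image file
plt.imshow(img)
plt.title("Click the points of one layer, then press Enter")

# Collect clicks until Enter is pressed; each click is an (x, y) pair.
points = plt.ginput(n=-1, timeout=0)
plt.close()

np.savetxt("layer_coordinates.txt", np.asarray(points), header="x y")
print(f"Saved {len(points)} points")
```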

2011

2010

CGGVeritas, Venezuela

Seismic data processing analyst

I was responsible for performing Seismic Data processing/Analysis and Borehole Data Analysis for oil and gas exploration.

  • Seismic/borehole data analysis: Machine Learning algorithms (regression, classification) for estimating reservoir properties and classifying reservoirs (see the sketch after this list).
  • 2D/3D Seismic data processing:
  • Geometrical spreading correction. Set-up of field geometry.
  • Application of field statics corrections, Deconvolution, trace balancing.
  • CMP sorting, Velocity analysis, Residual statics corrections.
  • NMO Correction, Muting, Stacking.
  • Filtering: Time-variant, band-pass.
  • Post-stack/Pre-stack time and depth migration.
  • Numerical modeling of seismic wave propagation.
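
As an illustration of the first bullet above, here is a minimal scikit-learn regression sketch. The data are entirely synthetic and the attribute names (acoustic impedance, Vp/Vs, gamma ray) are generic assumptions, not the actual project data or workflow.

```python
# Illustrative sketch: regression of a reservoir property (e.g. porosity) from
# seismic/borehole attributes with scikit-learn. Entirely synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500

# Generic attributes: acoustic impedance, Vp/Vs ratio, gamma ray.
X = np.column_stack([
    rng.normal(7000, 800, n),
    rng.normal(1.8, 0.15, n),
    rng.normal(60, 20, n),
])
# Synthetic "porosity" loosely tied to the attributes, plus noise.
y = 0.4 - 3e-5 * X[:, 0] + 0.05 * (2.0 - X[:, 1]) + rng.normal(0, 0.01, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out data:", round(r2_score(y_test, model.predict(X_test)), 3))
```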

2010

2008

Simón Bolívar University, Venezuela

Academic Assistant - Geophysics Department

As an Academic Assistant, I was in charge of collaborating with the lecturer by teaching some modules of the Geophysical Engineering program at Simón Bolívar University. I was usually in charge of a group of between 20 and 30 students during theoretical and practical activities.

  • Courses taught:
  • Seismic data processing: Concepts of discrete signal analysis (time series analysis), sampling, aliasing, and discrete Fourier transform. Conventional seismic data processing sequence. (A short aliasing demonstration follows this list.)
  • Seismic methods: The convolutional model of the seismic trace. Propagation and attenuation of seismic waves. Interpretation of seismic sections.
  • Seismic reservoir characterization: Relations between the acoustic impedance and the petrophysical parameters. Well-Seismic Ties. Seismic data analysis (Inversion and AVO).
  • This experience has contributed to my professional development in two major areas:
  • By teaching modules, I have enhanced my technical geophysical knowledge.
  • I have also developed communication and presentation skills, as well as the leadership strategies needed to manage a group of students and to transfer knowledge effectively.
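
As a small illustration of the sampling and aliasing concepts covered in the Seismic data processing module (not course material, just a quick NumPy demonstration):

```python
# Quick demonstration of aliasing: a 90 Hz sine sampled at 100 Hz (below its
# Nyquist rate of 180 Hz) shows up as a 10 Hz signal in the discrete spectrum.
import numpy as np

fs = 100.0                       # sampling frequency, Hz
f_true = 90.0                    # true signal frequency, Hz
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * f_true * t)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
print(f"Dominant frequency after sampling: {freqs[np.argmax(spectrum)]:.1f} Hz")  # ~10 Hz
```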



I have three years of experience as an academic assistant in the courses of Seismic Processing, Seismic Reservoir Characterization, and Seismic Methods.

During my experience as an academic assistant, I have solidified my knowledge of the theoretical basis of seismic processing; in particular, all the technical concepts that are required for this position, such as Seismic velocity analysis, Multiples, Surface statics correction, Noise attenuation, and Imaging.

During my experience as a teacher assistant, I was assigned three times to teach the Seismic data processing course. My work was to give theoretical and practical lessons. The theoretical part was focused on signal theory (concepts of discrete signal analysis, sampling, aliasing, and the discrete Fourier transform) and on the theoretical aspects of each stage of a conventional seismic processing sequence. In the practical part, the students had to process a 2D seismic data set using the Seismic Unix software, free software developed by the Colorado School of Mines.

I was the assistant to the lecturer in charge, but I was responsible for a large part of the course, since I had participated in it three times.