Wednesday, July 23, 2014

Playing with HBase's Result object in Spark: handling HBase KeyValue and byte arrays in Scala -- Real World Examples

This is the second part of "Lighting a Spark With HBase Full Edition".

You should first read the previous part about HBase dependencies and Spark classpaths: http://www.abcn.net/2014/07/lighting-spark-with-hbase-full-edition.html

You may also want to read this post for some background on combining HBase and Spark: http://www.vidyasource.com/blog/Programming/Scala/Java/Data/Hadoop/Analytics/2014/01/25/lighting-a-spark-with-hbase

This post aims to provide some more complicated, real-world examples on top of the post above.

First, you can symlink your hbase-site.xml into Spark's conf folder; otherwise you have to specify the full (absolute) path of hbase-site.xml in your code:
ln -s /etc/hbase/conf/hbase-site.xml $SPARK_HOME/conf/
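If you prefer not to touch Spark's conf folder, you can also load the file explicitly in code. A sketch using the standard Hadoop Configuration API (the path below is an example; use your own, and note this needs the HBase client jars on the classpath):

```scala
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HBaseConfiguration

val conf = HBaseConfiguration.create()
// Load HBase client settings (ZooKeeper quorum etc.) from an absolute path
// instead of relying on hbase-site.xml being on Spark's classpath:
conf.addResource(new Path("/etc/hbase/conf/hbase-site.xml"))
```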

Now let us warm up with a very simple HBase table with string row keys and string values.

Table contents:
hbase(main):001:0> scan 'tmp'
ROW                   COLUMN+CELL
 abc                  column=cf:test, timestamp=1401466636075, value=789
 abc                  column=cf:val, timestamp=1401466435722, value=789
 bar                  column=cf:val, timestamp=1396648974135, value=bb
 sku_2                column=cf:val, timestamp=1401464467396, value=999
 test                 column=cf:val, timestamp=1396649021478, value=bb
 tmp                  column=cf:val, timestamp=1401466616160, value=test

The post from vidyasource.com shows how to get values out of an HBase Result tuple, but not the keys.

The following code shows how to create an RDD of key-value pairs, RDD[(key, value)], from HBase Results:
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

// needed for the asScala calls below
import scala.collection.JavaConverters._

val conf = HBaseConfiguration.create()
conf.set(TableInputFormat.INPUT_TABLE, "tmp")

val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable], classOf[org.apache.hadoop.hbase.client.Result])

hBaseRDD.map(tuple => tuple._2).map(result => (result.getRow, result.getColumn("cf".getBytes(), "val".getBytes()))).map(row => {
(
  row._1.map(_.toChar).mkString,
  row._2.asScala.reduceLeft {
    (a, b) => if (a.getTimestamp > b.getTimestamp) a else b
  }.getValue
)
}).take(10)
You will get:
Array[(String, Array[Byte])] = Array((abc,Array(55, 56, 57)), (bar,Array(98, 98)), (sku_2,Array(57, 57, 57)), (test,Array(98, 98)), (tmp,Array(116, 101, 115, 116)))

In Scala we can use map(_.toChar).mkString to convert an Array[Byte] to a String (because, as we said, in this warm-up example the HBase table has only string values):
hBaseRDD.map(tuple => tuple._2).map(result => (result.getRow, result.getColumn("cf".getBytes(), "val".getBytes()))).map(row => {
(
  row._1.map(_.toChar).mkString,
  row._2.asScala.reduceLeft {
    (a, b) => if (a.getTimestamp > b.getTimestamp) a else b
  }.getValue.map(_.toChar).mkString
)
}).take(10)
Then we get:
Array[(String, String)] = Array((abc,789), (bar,bb), (sku_2,999), (test,bb), (tmp,test))
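HBase also ships its own helper for these conversions, org.apache.hadoop.hbase.util.Bytes, which reads more clearly than map(_.toChar).mkString. A sketch, assuming the HBase client jar is on the classpath:

```scala
import org.apache.hadoop.hbase.util.Bytes

// Equivalent conversions using HBase's own utility class:
val s: String = Bytes.toString("789".getBytes)   // byte[] -> String
val l: Long   = Bytes.toLong(Bytes.toBytes(15L)) // byte[] -> Long, handy for the next example
```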
=======================================================================

After warming up, let us take a more complicated HBase table example.

This table stores the UUID/cookie or whatever identifier of a user's different devices; you can imagine it is part of some kind of platform for cross-device user tracking and/or analyzing user behavior across devices.

userid, the row key, is a string (such as some kind of hashed value)
the column family is lf
the column qualifiers are the names or IDs of devices (such as internal IDs of User-Agent strings; in this example we use simple strings like app1 and app2 for mobile apps, pc1 and ios2 for different browsers on different devices)
the value of each cell is an 8-byte long (a byte array of length 8)

It looks like this:
hbase(main):001:0> scan 'test1'
ROW                   COLUMN+CELL
 user1                column=lf:app1, timestamp=1401645690042, value=\x00\x00\x00\x00\x00\x00\x00\x0F
 user1                column=lf:app2, timestamp=1401645690093, value=\x00\x00\x00\x00\x00\x00\x00\x10
 user2                column=lf:app1, timestamp=1401645690142, value=\x00\x00\x00\x00\x00\x00\x00\x11
 user2                column=lf:pc1,  timestamp=1401645690170, value=\x00\x00\x00\x00\x00\x00\x00\x12
 user3                column=lf:ios2, timestamp=1401645690180, value=\x00\x00\x00\x00\x00\x00\x00\x02

To create such a table, run puts like these in the hbase shell:
put 'test1', 'user1', 'lf:app1', "\x00\x00\x00\x00\x00\x00\x00\x0F"
put 'test1', 'user1', 'lf:app2', "\x00\x00\x00\x00\x00\x00\x00\x10"
put 'test1', 'user2', 'lf:app1', "\x00\x00\x00\x00\x00\x00\x00\x11"
put 'test1', 'user2', 'lf:pc1',  "\x00\x00\x00\x00\x00\x00\x00\x12"
put 'test1', 'user3', 'lf:ios2', "\x00\x00\x00\x00\x00\x00\x00\x02"

OK, then how can we read/scan this table from Spark?

Let us look at this code:
conf.set(TableInputFormat.INPUT_TABLE, "test1")

val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable], classOf[org.apache.hadoop.hbase.client.Result])

hBaseRDD.map(tuple => tuple._2).map(result => (result.getRow, result.getColumn("lf".getBytes(), "app1".getBytes()))).map(row => if (row._2.size > 0) {
(
  row._1.map(_.toChar).mkString,
  row._2.asScala.reduceLeft {
    (a, b) => if (a.getTimestamp > b.getTimestamp) a else b
  }.getValue.map(_.toInt).mkString
)
}).take(10)

Why map(_.toInt) this time? Because the bytes in this Array[Byte] are numbers, not chars.

But we get:
Array((user1,000000015), (user2,000000017), ())
What? 000000015?... Yes: _.toInt converts each element of the Array[Byte] to an Int, and mkString concatenates them. To avoid this, we can use java.nio.ByteBuffer.
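To see the difference on a plain 8-byte array, here is a pure-Scala sketch (no HBase needed; the array is the big-endian encoding of the long value 15):

```scala
import java.nio.ByteBuffer

// 8-byte big-endian encoding of the long value 15:
val bytes = Array[Byte](0, 0, 0, 0, 0, 0, 0, 15)

// Converting each byte to Int and concatenating gives the garbled string:
val perByte = bytes.map(_.toInt).mkString      // "000000015"

// ByteBuffer (big-endian by default) reads the whole array as one long:
val asLong = ByteBuffer.wrap(bytes).getLong    // 15
```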

The code should be changed to:
import java.nio.ByteBuffer
hBaseRDD.map(tuple => tuple._2).map(result => (result.getRow, result.getColumn("lf".getBytes(), "app1".getBytes()))).map(row => if (row._2.size > 0) {
(
  row._1.map(_.toChar).mkString,
  ByteBuffer.wrap(row._2.asScala.reduceLeft {
    (a, b) => if (a.getTimestamp > b.getTimestamp) a else b
  }.getValue).getLong
)
}).take(10)
Then we get:
Array((user1,15), (user2,17), ())
Finally it looks better, but what is the last ()?!

It is because row key user3 has no value in column lf:app1. So, again, we can do better: in the HBaseConfiguration object we can set TableInputFormat.SCAN_COLUMNS to restrict the scan to a particular column qualifier. So we change the code to the FINAL EDITION...
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

val conf = HBaseConfiguration.create()
conf.set(TableInputFormat.INPUT_TABLE, "test1")
conf.set(TableInputFormat.SCAN_COLUMNS, "lf:app1")

val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable], classOf[org.apache.hadoop.hbase.client.Result])

import java.nio.ByteBuffer
hBaseRDD.map(tuple => tuple._2).map(result => {
  ( result.getRow.map(_.toChar).mkString,
    ByteBuffer.wrap(result.value).getLong
  )
}).take(10)

And now we finally get:
Array[(String, Long)] = Array((user1,15), (user2,17))
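Instead of the string-based SCAN_COLUMNS, you can also hand TableInputFormat a fully configured Scan object, which gives you access to caching, filters, time ranges, and so on. A sketch, assuming the Scan is serialized with TableMapReduceUtil.convertScanToString from the HBase mapreduce package:

```scala
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.mapreduce.{TableInputFormat, TableMapReduceUtil}

// Build the scan programmatically instead of via the SCAN_COLUMNS string:
val scan = new Scan()
scan.addColumn("lf".getBytes, "app1".getBytes)  // same restriction as SCAN_COLUMNS
scan.setCaching(500)                            // fetch more rows per RPC round trip
// TableInputFormat reads its Scan from this configuration property:
conf.set(TableInputFormat.SCAN, TableMapReduceUtil.convertScanToString(scan))
```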

=======================================================================

FINAL FULL EDITION

Now, if you want to get all the key-value pairs of an HBase table (all versions of the values from all column qualifiers),

you can try this code (for the string-valued table "tmp"):
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

// needed for the asScala calls below
import scala.collection.JavaConverters._

import java.nio.ByteBuffer

type HBaseRow = java.util.NavigableMap[Array[Byte],
  java.util.NavigableMap[Array[Byte], java.util.NavigableMap[java.lang.Long, Array[Byte]]]]

type CFTimeseriesRow = Map[Array[Byte], Map[Array[Byte], Map[Long, Array[Byte]]]]

def navMapToMap(navMap: HBaseRow): CFTimeseriesRow =
  navMap.asScala.toMap.map(cf =>
    (cf._1, cf._2.asScala.toMap.map(col =>
      (col._1, col._2.asScala.toMap.map(elem => (elem._1.toLong, elem._2))))))

type CFTimeseriesRowStr = Map[String, Map[String, Map[Long, String]]]

def rowToStrMap(navMap: CFTimeseriesRow): CFTimeseriesRowStr =
  navMap.map(cf =>
    (cf._1.map(_.toChar).mkString, cf._2.map(col =>
      (col._1.map(_.toChar).mkString, col._2.map(elem => (elem._1, elem._2.map(_.toChar).mkString))))))

val conf = HBaseConfiguration.create()
conf.set(TableInputFormat.INPUT_TABLE, "tmp")

val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable], classOf[org.apache.hadoop.hbase.client.Result])

hBaseRDD.map(kv => (kv._1.get(), navMapToMap(kv._2.getMap))).map(kv => (kv._1.map(_.toChar).mkString, rowToStrMap(kv._2))).take(10)

For the long-valued column family "lf" in table "test1", you can define CFTimeseriesRowStr and rowToStrMap as follows:
type CFTimeseriesRowStr = Map[String, Map[String, Map[Long, Long]]]

def rowToStrMap(navMap: CFTimeseriesRow): CFTimeseriesRowStr =
  navMap.map(cf =>
    (cf._1.map(_.toChar).mkString, cf._2.map(col =>
      (col._1.map(_.toChar).mkString, col._2.map(elem => (elem._1, ByteBuffer.wrap(elem._2).getLong))))))


=======================================================================

Beyond all of this code, there are more details to think about when querying an HBase table, such as scan caching, whether to enable the block cache, and whether to use bloom filters.

Most important of all: Spark still uses org.apache.hadoop.hbase.mapreduce.TableInputFormat to read from HBase, the same as a MapReduce program or a Hive HBase table mapping. So there is a big problem: your job will fail if one of the regions of the target HBase table is splitting, because the original region is taken offline by the split!

So if your HBase regions must be splittable, be careful about using Spark or Hive to read from the HBase table; maybe you should write a coprocessor instead of using the hbase.mapreduce API.

If not, you should disable automatic region splits. The following slide summarizes the HBase config properties that control region splitting.
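As a sketch of one such property: hbase.regionserver.region.split.policy selects the split policy class cluster-wide in hbase-site.xml. DisabledRegionSplitPolicy (available in newer HBase releases) never triggers automatic splits; on older versions, setting hbase.hregion.max.filesize very high with the default constant-size policy achieves a similar effect.

```xml
<!-- hbase-site.xml: never trigger automatic region splits.
     DisabledRegionSplitPolicy exists in newer HBase releases only;
     check your version before relying on it. -->
<property>
  <name>hbase.regionserver.region.split.policy</name>
  <value>org.apache.hadoop.hbase.regionserver.DisabledRegionSplitPolicy</value>
</property>
```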


© Chutium / Teng Qiu @ ABC Netz Group